Content Testing - Bringing Science to the Business of Creativity with Shailesh Kapoor
What goes into content testing for a film or web series?
In the first episode of The Media Room, media expert and author Vanita Kohli-Khandekar sits down with Shailesh Kapoor, CEO of Ormax Media, to explore the interplay between data and creativity in the entertainment industry. Kapoor discusses in detail how Ormax Media uses its data models (which transformed post-pandemic) to help in both the pre-production and post-production phases of the content pipeline. He shares insights into Ormax’s research methods, including audience profiling, and how these tools have shaped marketing strategies for films and television. Kapoor also opens up about the challenges of navigating skepticism in the creative community and the importance of accuracy in audience analytics. Tune in for an inside look at how data is reshaping the future of Indian entertainment.
NOTE: This transcript was generated by a machine. Human eyes have gone through the script, but there might still be errors in some of the text, so please refer to the audio in case you need to clarify any part. If you want to get in touch regarding any feedback, you can drop us a message at [email protected].
---
TRANSCRIPT
Vanita Kohli-Khandekar: Hello and welcome to The Media Room... your place to know all there is about the business of media and entertainment.
Today we are going to talk about a relatively new but important element in the Indian film business – the testing of scripts, films, series and TV shows. Could you have predicted that Laapataa Ladies or Manjummel Boys would work? Or that audiences would love 12th Fail, one of the most successful films of 2023? Content testing helps a film or show in many ways – it informs the marketing on the audiences and markets to target. It helps the creative folk tweak things if needed. Hollywood and many European cinemas have been testing scripts and films for ages. There are test screenings and focus groups that have led to changes in plots, characters et al. In Los Angeles, which houses the biggest film industry in the world, Hollywood, Kevin Goetz's Screen Engine has been a big force in testing. In India that place has come to be occupied by Ormax Media, which began testing story ideas, scripts and fully made films and series in 2008.
I had a long chat with Shailesh Kapoor, co-founder and CEO of Ormax, on the ifs, buts, whys and wherefores of 'content' testing. Incidentally, many creators and writers, including me, do not like to use the word content... it dehumanises the whole creative aspect. As far as possible we will use the word storytelling.
—
Vanita Kohli-Khandekar: Hi, Shailesh, welcome to the show. And thank you for joining us. And for everybody who doesn't know, Shailesh Kapoor runs Ormax Media.
Shailesh, do you really, can you really predict whether a film is going to be a success or a failure?
Shailesh Kapoor: Hi, Vanita, good to be on your show. Coming to your question, you see, of course, if you could predict everything to the second decimal place and all of that, then, you know, we won't be talking here. I'll probably be already retired.
And, you know, I would have bought some islands or some countries. But the point is that somebody told me a few years ago that, you know, if you can broadly predict most of the things within a certain range, it is still a very good skill to have. So I think when we say prediction, there's no absolute exact number.
You have to, like, normally, if you look at box office and things like that, within 10, 15% is fairly accurate. And I think that's what the attempt is that rather than not attempting to predict, one does it and over time, one gets better at it. So compared to what we would have been five years ago or seven years ago and now, I think there's a huge difference.
And as long as the accuracy is improving with time, it's a good sign. And that's what we look at, the trajectory of the accuracy itself.
Vanita Kohli-Khandekar: Fantastic. Now, you know, I think we sort of jumped into the success and failure, but first I want you to tell us what is, and I'm using the word content with great hesitation here. I prefer to use the word film or script or series testing all about.
What do you do when you test it? What is the point of it all and how is it done?
Shailesh Kapoor: Yeah.
Vanita Kohli-Khandekar: And has it been successful? So, you know, you will have to take us in steps through all of this.
Shailesh Kapoor: Yes, so a lot of questions there. So, see, we started doing it in 2008. So in 16 years, obviously we've tried to grow this market.
As of now, we have finished 1,012 testing projects across domains – linear television, streaming, and theatrical – and across languages. Some are the final product, which is video, some are screenplays, and some are just at a very early stage of a concept note or synopsis. So these are the three stages one would test at.
Synopsis is when you've not really invested in development, you're looking at an idea and you want to see if the idea is something worth even putting six months to one year of development time and cost on. Screenplay is when you have a script ready and you want to, before you cast for it, before you decide to green light it and put production budgets on it, or sell the script off, or, you know, decline the script if it has been pitched to you, you may test it to understand if it's worth investing in. And final product, of course, is when you've shot the series or the film or the TV show, and you are committed to doing it, or it has been offered to you for acquisition, and you are evaluating whether you should acquire a ready film or not.
And that's the stage when the video testing comes in. Now, obviously, you know, these three stages are different from each other, and the kind of business decisions are also different. At the concept stage, the primary business decision is go, no go.
A lot of times people have eight ideas and they want to use the concept testing research to decide which two or three to focus on in terms of development, because there is always an internal preference. There will be a debate within the team that these are the ones which we think will work better. And this process of testing kind of brings some objective audience data to it, and it helps, you know, it informs the internal debate.
So the studio would have probably said, this is my top preference. So as long as it doesn't test very badly, I would still be keen on doing it, versus, you know, something that you may not be very keen on, but the results make you rethink. So that's something that happens.
In the screenplay stage, the focus is on development – to understand if the characters are working, if the conflict is working, what can be done to make the script stronger. In many cases we have tested scripts which were then rewritten, and we have tested the rewritten version to see if the issues that came up in the original version have been addressed. At the film stage, a lot of people, you know, say that once you've shot something, how much can you really change?
That's not very true, especially in series, but even in films nowadays. I think the time from when production ends to when the release or launch happens is often six months, 12 months even. So there is a lot of time to work on the edit, sometimes reshoot. And across all these three stages, especially the script and the video stage, the last stage, a lot of marketing inputs come from the research. What kind of trailer to cut?
Who's the TG? Should the film be marketed to only single screens, or only multiplexes, or to both multiplexes and single screens as a combination? How should it be positioned?
A lot of platforms use the research to decide what kind of marketing budgets to put behind the series. And, you know, a lot of decisions. So from studio to studio, company to company, from a Dharma to a Yashraj to an Amazon to a Netflix, the way they would decide how to use the research varies because, you know, the people are different and everybody has a different way of looking at content testing.
But by and large, these are the three stages at which it happens.
Vanita Kohli-Khandekar: And- But is the bulk of what you do at the finished stage or at the script/screenplay stage? Just some sense of that. Is the bulk of it a finished film or a show coming to you, or is the bulk of it scripts and screenplays?
Shailesh Kapoor: In streaming and theatrical, I would say around 60-65% is at the finished product stage. And the remaining is at either the concept or the screenplay stage. In linear television, almost all of it is at a concept stage because, you know, it's daily soaps – long form.
So you're not really writing scripts beyond a point. You're more green lighting a show based on the broad idea of what the arc over three months or six months will be, which becomes more the concept. But yeah, till 2017, I would say almost all of it was video stage in films.
Then slowly, we also started developing a stronger tool for script testing. There are concerns in script testing, such as: how do you capture the director's vision? And the truth is you can't, because, you know, the director will shoot it in a certain way.
But the idea of script testing is to be able to capture the characters and the core story or the essence of the film via the stimulus that is used in the research. And most of the time, we have seen that audiences react to characters and conflicts. And if they get them, the rest of the things they can visualise fairly well.
Vanita Kohli-Khandekar: So slowly- So in films, is it finished product testing more often than in streaming? Just some sense of what it is in films.
Shailesh Kapoor: So in both streaming and films, it will be 60, 65% finished product. Okay.
Vanita Kohli-Khandekar: Is there an example here, Shailesh? Can you give me a sense of, you know, either a film or a show, could you share?
Shailesh Kapoor: So for example, if you look at somebody like Dharma, you know, they test all their films at the finished product stage, because generally, as an idea, script testing doesn't work for them, but they really benefit from the finished product stage. So if you take a film like Rocky Aur Rani Kii Prem Kahaani, it was tested across four to five markets one or two months before the release. And one of the interesting things that I remember came out was that despite the film being Delhi-based, the performance was stronger in Mumbai compared to Delhi.
And that was because the film is slightly progressive in its themes in the second half, and it's talking about gender equality in a certain way. And Mumbai is a market that responds better to more progressive themes compared to Delhi. While the film itself tested quite well and there was no real concern with the content, information like this does inform the marketing campaign of the film.
And you know what kind of assets you'll have to bring out for a market like Delhi to bring them on board, maybe differently as compared to certain other markets. And because today we have social media, while there is always a main trailer, you can create assets and target specific geographies and demographics through Instagram and through Facebook and through YouTube. So there is always an attempt through research like this to understand differences between markets, age and gender segments, and accordingly create a campaign which kind of ticks the right boxes for the right TG.
So yeah, I think at the finished product stage, a lot of times when the results are good, it is reassuring, but there'll always be some takeouts from it. But then there have been cases – like if you talk about a film like Queen, which we tested a little more than 10 years ago, or a film like Badhaai Ho – where the results far surpassed the expectations of the studio. Because those films are smaller in their star cast and scale, the results kind of made the studios – in Queen's case it was Viacom, and in Badhaai Ho's case it was Junglee Pictures – believe in the film, you know, back it more with marketing muscle, kind of give it a bigger release.
So a lot of times it could just come down to these things now. So it's not just about testing content and giving feedback and changes in content. Of course, that also happens – as we know from the case study of Mulk that we have spoken about earlier, where, based on the testing, the director recast the character and cast Kumud Mishra as the judge; the version we tested didn't have that actor at all.
So sometimes the changes could be fairly significant in terms of the content, but a lot of times it could be in terms of the distribution, marketing, and those kinds of aspects also which get influenced through the results.
Vanita Kohli-Khandekar: When you say you're testing the script or you're testing the film as a finished product, you test for characters and story, et cetera. But is there also some sort of prediction involved in terms of what the film is going to be doing at the box office? Or is that a separate service, or part of the content or concept testing? Because there's that 10-15% sort of margin of error.
Shailesh Kapoor: No, it is a part of this. So see, we have a separate tool which predicts the first day of all the films that are releasing. That is Cinematics; that is separate.
What happens is that when we test a film like Rocky Aur Rani, it gives us a score out of 100 on how much audiences like the film. Suppose that score is 65, which is a fairly good score. Now, based on that score, what the tool – which is kind of an algorithm of sorts, based on past data – will predict is that if the film opens at 10 crore, then this is going to be the lifetime box office.
If the film opens at 12 crore, this will be the lifetime box office. It will give three scenarios, a realistic scenario, realistic plus scenario, and a realistic minus scenario. Because how much the film will open at is not something that is a function of the content itself.
So for that, you need to test the trailer and test the posters – campaign testing, which is a separate service. So technically that's not content testing. That also is happening a lot of late, because the importance of getting a good trailer out is something that everybody understands now.
And earlier, the concern with campaign testing was that, you know, there will be no time. People will have a trailer ready and they would want to launch it in three days, one week. And so of late, we have, you know, created a campaign testing tool, which gives you results in four days.
So because of that, a lot of trailer testing has also picked up, which is also part of the testing kind of services that we provide. So in that sense... sorry, I lost track of where I was.
Vanita Kohli-Khandekar: We were talking about whether box office can be built, revenues can be built into the concept testing. You said they're separate parts of the, yeah.
Shailesh Kapoor: Yeah. So essentially when you're testing a film or a script, you'll get three scenarios of realistic plus, realistic minus and realistic. Now at a script stage, you may not have your star cast.
So then one common section in the script testing is: if I take Ayushmann, then what is the potential of this script? If I take, say, Vicky Kaushal, what is the potential of this film? And for that one would need to look at what those stars' opening potential is and how that translates into lifetime box office.
Obviously a lot of this is slightly speculative, because what may happen is that we test a script of, say, an actor like Vicky Kaushal when Uri has not come out. But by the time this film whose script we tested comes out, Uri has come out. Now suddenly Vicky Kaushal has become a bigger star.
So then all the numbers will have to be recalibrated because you have to look at the star's potential at the time of the release of the film. Because sometimes there could be a lag of up to two years between script testing and the film release.
Vanita Kohli-Khandekar: Shailesh, but you know, I mean, there are a lot of questions about the mechanics of it. But my big, large question to you would be: how much science can you bring into this? How much of the risk... see, by its very nature, storytelling, entertainment is risky.
And what tools like yours at Ormax Media do, or what they do in Hollywood – you remember we spoke to Kevin Goetz in LA – is try to minimise the risk and bring a certain level of certainty, which is totally fair. But how much science can you bring in? How much is the vagaries of the business and just sheer dumb luck, as one would say?
Shailesh Kapoor: Yeah, I think that's obviously a very relevant question. And we get asked that question all the time. My answer to that is that, you know, our comparison of testing is to not testing at all.
So the idea of not testing at all is that you're opening yourself up to a Friday when the film will release or a Friday when the series will drop. And then, you know, within 15 minutes, half an hour, one hour social media is anyway giving you a verdict on how the content is and what worked, what didn't work. So there is no hiding place today in the real world anyway.
So a process of testing essentially is giving you the same feedback three months, six months in advance and giving you time and space to reflect upon it and see if you want to make changes in the content or market it differently. And I feel like our whole approach is that a research report or a testing report is not trying to impede creativity. It's actually trying to enhance it by giving the creator, the director mostly or the showrunner for a web series material to reflect upon.
We, for example, as a very, very strong rule, do not give creative recommendations. Like, we will say this is a problem, but we will not say, put a flashback scene to solve it. Because we are clear that we are not writers.
It is not our job to say how to fix a problem. Our job is to highlight what is bothering the audiences or what they did not understand or what they did not connect with. The creative solution is best left to the person whose project it is or whose idea it is.
So one doesn't want to step on writers', directors' and producers' toes and tell them this is how you could have done this scene or this climax or whatever. But one can definitely say: this is what audiences thought about it. And these are some of the things that you should think about.
And I think with that approach, over time a lot of creative people, including some of the top directors, have become comfortable, because nobody likes to be told: you made a film like this, but you could have made it like this. But when people are told that this is what audiences did not understand, I think there is some truth in that which resonates with people. Also the process is quite transparent, which really helps, you know, because a lot of times we have seen directors coming and sitting in the CCTV room, watching the focus groups, hearing them on the headphones, and seeing their films being appreciated or at times being ripped apart.
The people sitting in the next room are not aware that, you know, the director is sitting and watching them, because it's a standard format in which a focus group happens. You know, you sit in a room and you watch the video on CCTV. But when you see it happening in front of you in real time, it's very difficult to unsee it. So it's a process which we have kept very transparent, intentionally.
It's not just a report which lands on your table and, you know, this is what Ormax is saying. You're actually part of the field process and we encourage people to come and watch the process unfold in front of their eyes.
Vanita Kohli-Khandekar: So quickly, what is the process, if I was to ask you? And also, you know, I remember the last time when we spoke about Mulk, you talked about how, you know, changing the character improved the movie's takings at the box office, or at least had some impact. I mean, is there some measurable result like that at the end of it?
Or is that something not every filmmaker or studio is looking for? So there are two questions here. One is, take us quickly through the steps of the process and then...
Shailesh Kapoor: Yeah. So I'll start with the process. The first step when we get a brief from a producer or a studio or OTT platform is to lock the design of the research with them because it involves saying which markets we want to do the research in, what is the demographic composition of the audience?
For example, what should be the age and gender breakup of the audience and any other criteria. For example, if you're testing a horror film, should we go only to audiences who watch horror films or should we also go to those who do not watch horror films? Now, all of these things, because we are sitting on a lot of data, we have a recommendation to make, but then the studio understands their film as well.
This recommendation typically takes a couple of days, and it happens after we have watched the film or the series, so that we know the content as well before we recommend who you should go to – because you don't want to go to a person who's never going to even consider buying a ticket for this film. So this process, once it's locked, most of the time it is anything from two markets to six markets, depending on budgets, depending on the scale of the content and all of that.
So suppose a study is locked with four markets and we decide the markets. Now in each market, a date is fixed on which people are invited. So we have a panel of around three lakh people right now, which is a growing online panel.
Now through this panel, we invite people. The entire process happens offline, but the invitations go online. People are invited.
They're told that on this date at this time, a film is being tested. Typically, we would give the lead star's name out in the invitation, not the name of the film. For example, for Rocky Rani, we would have said a film starring Ranveer Singh and Alia Bhatt.
Most people could figure out what film it is, but it's just a way of inviting and that's how it happens in Hollywood also mostly. People who are available will, you know, RSVP and accept that invite. It is an incentivised process.
At the end of the process, they will get a gift voucher like BookMyShow or Amazon. And so they come to the venue. They know that they have to spend three, four hours there.
They watch the film first. Now there are two parts of the sample. One larger part of the sample takes part in the quantitative research, where they have to fill in questionnaires, and a lot of data is collected, which then goes through our tools and models to give certain scores, which people who use these products over time get familiar with.
Because for example, there is a score on pace, there is a score on comprehension, there is a score on each character's appeal and many, many, many other things like that. And then some people sit in the focus groups, which is a more qualitative discussion on what they like, what they didn't like, where a researcher moderates the group and probes them for things that they have to say about the film. So these two processes happen in two different rooms after the film screening is complete.
And then this data is collected across all the four or five cities, whatever the number of cities is. And then it leads to a report which seamlessly combines the quantitative data and all the numbers with the qualitative insight, and tries to present a larger story of what the score of the film is – as I gave the example of Rocky Aur Rani, 65 being the score out of 100 – and then all the implications of that score in terms of box office, in terms of what people liked, what they didn't like.
There's a lot of diagnostics. So there's a full docket which goes out. So this is the process.
Now, no one person attends more than two research studies in a year, because we have a six-month cooling period. Because otherwise what happens is that if the same people are coming again and again, they become too informed or expert-like about the process, and we want to keep it as layperson as possible. So while there is a huge panel in place, there is a cooling period of six months which we enforce quite strictly.
Vanita Kohli-Khandekar: So this is- In one city, how many people attend a screening?
Shailesh Kapoor: Typically it is 75 to 100, but certain streaming platforms go for 150 as well. So I think it varies. A lot of companies have their international policies on research.
So some of the MNCs bring that thought process that below a certain sample, we should not research. But in India, homegrown production houses and all are starting off. So it's a case of budget essentially more than anything else.
So I think it varies depending on the size of the company.
Vanita Kohli-Khandekar: Okay, so you took us through the process. And then do you sort of build in that, listen, if you do this, or if this can be tweaked, then the film or the show could be doing so much better? Can we come to that?
Shailesh Kapoor: We are asked that question, but the thing is, it's very speculative. Till you do it and test it again, there is no way of doing that. In Mulk's case, there was a forecast.
And when they made that change in the film, you could see what actually happened at the box office. And that difference is, kind of, one of the explanations of what the change could have contributed. But there could have been other factors.
So it's, but it's not just that one thing. In scripts, this happens a lot more. One script is tested.
It gets rewritten based on the research feedback. Six months later, it'll get tested again. Then you understand what the difference is and whether the score went up, whether the potential at the box office went up.
Of course, when the final product, very rarely, but few times we have tested a film twice where it will be tested once, changes will be made, it'll be retested. It happened with web series also a couple of times, but most of the time, I think, the platform or the studio is not really that interested in being technical about it. They say once we've got a sense of what we need to change, then the creative process will take over.
Vanita Kohli-Khandekar: But do you find – and this is something I think I noticed earlier also when you've talked about content testing – do you find that the Hindi side of things is more open to testing and research than other cinemas? Because I don't hear of any other cinemas being tested as extensively as Hindi. Or is that-
Shailesh Kapoor: For sure. I think it's more about the national players versus the more region-specific players. So whether it's a Viacom or a Zee or a Star – Fox Star, Star, Disney, whatever you call it – they do test a lot of stuff in the South.
But it's the local players in the South which are not doing it that much. So I think when you have more of an all-India kind of approach, whatever you're doing in Hindi, obviously you will want to replicate in other languages to whatever extent your budgets allow. You see, in Hindi also, when we started doing film testing in particular, the TV industry at least was using research in various ways.
But film industry, when we started in 2010, 11, and we started talking about film testing, people couldn't believe that this is something we're even offering as a service. Because it was like, how would you test a film? My film will get leaked.
Because that was the first thing. You know, it'll come on social media, it'll come on YouTube. So from the basics of privacy, how do you make sure that nothing comes out – even something as simple as a respondent in the research not putting out a message saying, I watched this film.
The whole process of security and all the protocol around that is something that initially people would be very unsure of, especially for a thriller film which has a climax with a big twist and stuff like that. I think in Hindi, it took four, five years for people to slowly get used to it. And then, you know, they would hear from each other: oh, you tested Drishyam? A film like Drishyam – if you know the climax, you pretty much know the whole film.
But then, you know, people realise that it is being tested, and in the film industry, word does spread very fast. In the South, because we've started in some serious way only three, four years ago, that lag is there. Because even in Hindi, we really had to go through this conversion process of converting non-believers and sceptics into believers.
So that process has just started in the South – a couple of years ago.
So maybe with just a lag of a few years – I think in two, three years, hopefully we should see more people in the South testing more stuff.
Vanita Kohli-Khandekar: But how long does the process itself take if you do a film or a script? I mean, typically, roughly speaking, how long is the process, and what does it cost?
Shailesh Kapoor: The process duration is two to four weeks, depending mostly on how many cities. And as I said, for a trailer, it is four to five days – we've tried to keep that tighter. But two to four weeks, depending on the number of cities.
And cost, again, would vary from six, seven, eight lakhs for a two-city study to around 24 lakhs for a five, six-city study. So roughly one could say it will average around four to five lakh rupees per city for a film testing. But sometimes, again, it's a function of what sample size and what kind of TG you are going to.
So it can vary a little bit.
Vanita Kohli-Khandekar: But my point is that as a percentage of, let's say, a studio's P&L or a project cost on that particular film, testing is not very expensive. Is that- Not at all, yeah.
Shailesh Kapoor: Not at all.
Vanita Kohli-Khandekar: So the challenge is not pricing. What is the challenge? I mean, I'm assuming directors resisted, creators, or is that phase over now in India?
How do we compare to an LA for Hollywood, for example? You know?
Shailesh Kapoor: Yeah. So I think, yeah, that's very interesting, because the challenge phase is over, but then there is a next phase, where you start putting research costs as a part of your film's budget. That phase has not happened beyond the top four, five companies.
So when you're budgeting for a film, you are keeping 25 lakhs, 30 lakhs aside, saying this will go in research. It is almost like a line item in your budget, rather than a marketing overhead, which comes later and you begin to worry about, okay, do I have this money? Where do I take it out from?
So I think initially when we started, nobody had this budget. So people were like, okay, I want to test it, but I'm already over budget on the film or on the marketing. So somewhere I have to swing out this 15 lakhs, 20 lakhs, 10 lakhs, whatever that time the number was.
Now we are seeing that with the bigger studios who are testing regularly, they are actually budgeting this in right at the start. It is part of the marketing budget mostly. So it is like saying that, okay, I'll buy media worth this much.
I will pay my ad agency this much, I'll pay my trailer-making agency this much, and I'll keep this much for Ormax, or for the research part of the project. So this stage is the ideal stage, because basically you've entered the budget – you've become like a line item.
Yeah, so this has not happened fully even among those who are testing, because a lot of people have one film coming out in a year. So those who are doing it regularly have done this with a certain rigour. Sometimes people have this thing that they'll test only if they are unsure of the film, which is like saying, if I know the film is bad, why do I really need to test?
I know that the audience will reject it. So at times people have gone down that route.
Vanita Kohli-Khandekar: Are there people who come with the knowledge that my film is bad, and then say, why the hell should I test it?
Shailesh Kapoor: No, like a producer would know that, a director may not know that. But as a producer, you may realise that you commissioned a film and the final product didn't turn out to be great. So you want to cut your losses and not keep spending on marketing and research and all these things. So at times that could be the case as well.
But yeah, I guess today the way it is that you can't really rely on stardom and say that a star will get you a certain opening because content and audiences are obviously becoming very unpredictable and selective at the same time. But I think there's even more merit in testing now because earlier there would always be a thing that if it's a film with a certain star, a certain minimum number was guaranteed. Now there's no lower limit of how low a film of a star can go.
We have seen what has happened with films of some of the biggest stars in the last one year which have opened at one crore, two crore.
Vanita Kohli-Khandekar: You know, Shailesh, the outliers, you know. I shouldn't use the word outlier, unfortunately. So if I use a Manjummel Boys, or if I talk about a Munjya, or if I talk about- Twelfth Fail.
Has Ormax, were these tested, has Ormax said something or predicted something? There are two questions here. One, you know, when these, I'm using the word outliers.
I mean, a Vidhu Vinod Chopra film is as mainstream as possible, but the star cast was not big. So perhaps I'm, and how would you measure your success rate, Ormax's report card? Because content testing attacks the heart of the creative process.
And I can imagine the kind of barriers you would have faced when you started, and even now face. So where do you, you know, stand on those?
Shailesh Kapoor: Yeah. Attacks may not be the right word. The idea is that we're trying to be, yeah, but it is definitely relevant to the heart of the creative process.
See, unfortunately we haven't tested a Twelfth Fail kind of a film in recent times, but in the past we have. And I think, like, Andhadhun is not that big an outlier, but it outperformed by some margin. We tested it and it tested well, and there was a sense that it'll do around a certain number, like 40, 50, 60 crore.
But it then went on to cross 100 crore.
Vanita Kohli-Khandekar: It's a good film.
—
Vanita Kohli-Khandekar: 12th Fail collected Rs 70 crore at the BO.. budget Rs 20 crore.. And here I would like to add that box office collection means literally that.. the money collected at the theatrical window. This is then split three ways – taxes, trade share and the studio/producer's share. So when a Pathaan collects Rs 1,000 crore it doesn't mean Yashraj Films gets that Rs 1,000 crore.. They probably end up getting around Rs 300 crore. To this add other revenues – TV, streaming et al. From the total, deduct production and marketing costs before you can arrive at any figure on the profits a film makes. So when a newspaper or anchor tells you that a film made Rs 1,000 crore, that money hasn't reached one actor or one studio. There is a value chain and the money goes through that entire chain, hydrating it. Many films, say Kabhi Khushi Kabhi Gham, continue to make money from relicensing across different markets in the world. Back to the podcast..
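The Rs 1,000 crore to roughly Rs 300 crore arithmetic can be sketched as a simple waterfall. The rates below (GST, exhibitor share, distribution fee) are illustrative assumptions chosen to reproduce the note's rough figure; they are not confirmed industry numbers, which vary by film, week and territory.

```python
# A rough sketch of the theatrical revenue waterfall described in the note.
# All percentages are illustrative assumptions, not verified industry rates.

def studio_share(gross_bo_crore, gst_rate=0.18, exhibitor_share=0.55, dist_fee=0.20):
    """Estimate what reaches the studio/producer from gross box office (in crore)."""
    net = gross_bo_crore / (1 + gst_rate)          # strip taxes from ticket sales
    to_distribution = net * (1 - exhibitor_share)  # theatres keep their trade share
    return to_distribution * (1 - dist_fee)        # distributor's fee comes off next

# For a Rs 1,000 crore grosser, roughly Rs 300 crore reaches the studio
print(round(studio_share(1000)))  # 305
```

Under these assumed rates, a "Rs 1,000 crore film" nets its studio about Rs 305 crore before production and marketing costs are deducted, which matches the note's ballpark.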
—
Shailesh Kapoor: Yes. And so our thing is that that kind of error we still find okay, if once in a while it has to happen, that we under-predicted a good film.
The one which we really hope doesn't happen and it's not happened for a while is that you over predict an average or a bad film. That you say that this will do well and it doesn't.
Vanita Kohli-Khandekar: Because what happens- When did you last do that?
Shailesh Kapoor: It may have happened back in the day, 2014, 15. I don't remember, but that's something that has not happened for many years now. And I think, so there is some element of conservatism built into our models.
Because the idea is that from a producer point of view, if you've been told that you'll do 80 and you do 100, that is fine. But if you're being told you do 80 and you do 60, that's a bigger problem than doing 100 instead of 80. So that we're very careful about.
Having said that, none of this is human intervention. All of this is via statistical models and predictive tools. So models learn over time; instances happen, and then those instances go in as inputs and subsequently get addressed.
So from that point of view, it's been fine. But one thing I have to say, because this 12th Fail example came up, is that though we didn't test it, now the problem with an outlier is that everybody wants to be an outlier. Now every other small film thinks that it can be the next 12th Fail.
But the thing is that it's an outlier because it'll happen probably once in five years or once in three years. Yet all small films are now thinking that they can be the next 12th Fail. That kind of, in a way, gives you a false sense of confidence about what you have made and all of that.
So that's the flip side of having outliers, that they can misguide people into believing the content is better than it is.
Vanita Kohli-Khandekar: I thought Laapata Ladies would be the outlier, but in '23, 12th Fail was the outlier. I saw Laapata Ladies in the theatre, but the theatre was half empty.
And then three months later, I have had people walking up and saying, have you seen Laapata Ladies? I said, I saw it in the theatre. You're now seeing it on streaming.
And I remember speaking to Ajay Bijli also, that many of these films do not reach their full potential in the theatre. And this leads to my next question, because the market has changed even since you started in 2010, 11, you know. And I think in bulk, you've been testing only over the last five, six years, if I'm right. So, you know, the market has changed.
You have a far more distracted audience. I was speaking to Prasoon Joshi the other day and he said, you know, there is content indigestion. It's so much of a glut. And there's some new statistic floating around saying that the attention span of human beings is less than that of a goldfish.
I don't know how scientific that is, but my question to you, therefore, is that you have a much different market than, let's say, 12, 13 years ago. It's a different film business. It's a different entertainment business.
How much do you think your process needs to change, or has your process changed, to capture this? See, because your models, the regression equation, will remain the same.
Do you tinker with the variables in that equation? What do you have to do in order to be up to speed?
Shailesh Kapoor: So across all our tools, across all domains, once the pandemic happened and work started again post-pandemic in 2021, we discarded all old data. We actually do not have any pre-2020 data in any of our models or tools now, because it is so fundamentally different in terms of how audiences are behaving, how genres are performing, what kind of variables are impacting. And I'm not talking just testing.
I'm saying even any other work, consulting work or any other work. Now, it obviously pushed us back by some time, because you had built five years of data and you were building tools on top of that. But of course, once you have done it, your learning has come in terms of what kind of variables to look at. So when you start building the new data, you will not start where you started right at the start.
You will obviously have the advantage of having gone through the learning curve, but the data will still be built new. Fortunately, what happened was that in 2021, 22, a lot of content came out in both streaming and theatrical, because there was a pipeline clogging which had happened due to theatres being closed and all of that. So it didn't take that much time to rebuild some of that data and create those models and make them function again.
But nothing that we're putting out today has anything pre-2021. So in that sense, we are now on four years of data, but because in these four years a lot of work has happened and a lot of stuff has come out, it is still a good amount of data that is there. But yeah, I mean-
Vanita Kohli-Khandekar: No, base year is 21 then?
Shailesh Kapoor: Yeah, base year is 21 onwards, I would say.
Vanita Kohli-Khandekar: But this is your factoring in what is happening in terms of your models. In your sampling, how does that change? Because you have a different audience, I mean, are you mixing the ages, the demographics, the psychographics?
Because this is a different audience. This is a short video audience.
Shailesh Kapoor: Yeah.
Vanita Kohli-Khandekar: Their appetite for long form, I think has gone, has sort of very badly impacted.
Shailesh Kapoor: No, absolutely true. And I think in TV, at least, viewership data is there, so it's slightly easier. In both streaming and theatrical, our approach has been that we have to have category data available with us too, because each platform has its own data, but nobody has category-level data.
As you would know, we do this annual sizing report for OTT and this annual Sizing the Cinema report. Now those reports, well, they're also monetised and all of that. But the primary purpose of doing these annual reports is to size the market and say, what is the gender breakup of the SVOD audience?
What is the gender breakup of the theatre-going audience? How does it vary by Hindi, Tamil, Telugu, and all of that? And that is based on primary research, because no theatre can give you that breakup either; tickets are not being booked by individuals, they're being booked by one person for a family or whatever.
So the only way to get this data is to collect it periodically, ground up. Because otherwise one can sit and do some broad approximation and say, it must be 60-40, but one needs a large-sample research once a year to say it is 63-37 and not 60-40. And then, that 63-37, what is the age breakup of that? What is the market breakup of that, and all of that?
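The point about needing a large sample to call a split 63-37 rather than 60-40 comes down to the margin of error on a surveyed proportion. The textbook 95%-confidence formula below is offered purely as an illustration of that arithmetic; it is not a description of Ormax's actual methodology or sample sizes.

```python
# Illustrative only: standard sample-size formula for estimating a proportion
# within a given margin of error at 95% confidence (z = 1.96).
import math

def required_sample(p, margin, z=1.96):
    """Respondents needed to estimate proportion p within +/- margin (95% CI)."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# Telling 63% apart from 60% means the margin must be well under 3 points:
print(required_sample(0.6, 0.015))  # 4098 respondents for +/- 1.5 points
print(required_sample(0.6, 0.03))   # 1025 respondents for +/- 3 points
```

Halving the margin of error quadruples the required sample, which is why a broad approximation is cheap but a defensible 63-37 figure needs a few thousand respondents collected ground up.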
So this data is something that is an annual process. The streaming report, in fact, we do every year in August, September. It actually served a big need, because a lot of platforms, like, so for example, Amazon will have its own data, but there is no way for them to know what is happening outside their user base. Like, if they have got, say, X million subscribers, what is the total pool?
What are they fighting for? What is the upper limit? So these questions are very relevant and obviously all sample-based research comes with its own error margins and all that, and everybody understands that.
But at least there is some data which is coming into the system. So, in a way, we have seen that post-pandemic there have been changes in the audience profile. For example, the number of films audiences are watching has gone down, but the number of people has gone up, because only the big films are working, right?
Vanita Kohli-Khandekar: Yeah, we saw this in 23 also. Tell me, and this is probably my last question, but tell me about a big success and a big boo-boo, or a big leak, if ever something like that has happened.
Shailesh Kapoor: A leak has never happened. That is, like, something we've heard.
Vanita Kohli-Khandekar: In the Akshay Kumar film, I remember you mentioned something had happened.
Shailesh Kapoor: Okay, so what I mentioned was that the only leak that has happened is somebody posted on Twitter saying, I have watched this film.
Vanita Kohli-Khandekar: Oh, okay.
Shailesh Kapoor: And, like, to us, even that was a leak of sorts, because even that is not, like-
Vanita Kohli-Khandekar: Yeah, and they sign NDAs and all that.
Shailesh Kapoor: We got that tweet removed and all of that, but there was nothing about the film in that tweet, nothing about the story or the content of the film itself. So on leaks, I think our processes are far more advanced now than they were, say, a few years ago, but it still comes up as a question for first-time users of content testing, saying, but my film will get out.
People find it hard to believe that in the age of social media, something like this can happen and it will not come out on Twitter or Facebook or wherever. So, but I guess that's interesting and it's a good conversation to have. Successes have been, I mean, I don't know, like, it depends, right?
Vanita Kohli-Khandekar: I know you can't name many of them, because you have 1,000 pieces of content and you can't name 1,000. But anything that stands out? Even without naming, if you can just tell me something that stands out in your head over 10 years of doing this.
Shailesh Kapoor: Yeah, a project that comes to mind, personally, I think, and even for the team at Ormax, where there was a lot of satisfaction, was when we tested Kapil Sharma's debut film. I think it's around 10 years old. Yeah, he did a film, Kis Kisko Pyaar Karoon; Abbas-Mustan directed it. Now the film itself didn't test that well, but the whole thing was that he was a household name and his show was at its peak at that time. And the whole challenge was, how do you use this research to understand how the film should be marketed? And through the research, this idea came out that you should, you know, not sell him like a star, you should sell him like a family member and pitch him as a family member.
And the entire campaign was him going to different cities and interacting with audiences, more like family. So it was designed very differently from a typical film campaign. And like that, back in 2015, I think, it opened at seven and a half crores, which was the highest opening for a debut.
Yeah, which later, I think, Student of the Year overtook a couple of months later, or maybe they came around the same time, I'm forgetting the timeline, but they were in the same range, seven and a half, eight crore. So I think that's an example of how, I would say, an average film, not a bad film and not a great film, can actually benefit from research in terms of how to market itself and how to get an opening, which then becomes the foundation for how you grow on top of that. So that's something that I think is an interesting example. See, there have been films where, like the Andhadhun example I gave, and Raazi equally in the same year, the feedback was positive, but we could not capture the degree of appreciation that the film would get.
And I think that's something that, over time, obviously we got better at, but there were a couple of instances like that in that 2018, 19 period when-
Vanita Kohli-Khandekar: Raazi was a big hit, right?
Shailesh Kapoor: Yeah.
Vanita Kohli-Khandekar: I remember seeing it in the theatre twice actually.
Shailesh Kapoor: Yeah, really nice. Raazi tested quite well, but the thing was that what it did was even better. So from six-
Vanita Kohli-Khandekar: Capturing that-
Shailesh Kapoor: Yeah, so I think that's something that, again, is a function of the tools, and the models have to get better at it all the time.
And of late, it's been really good. In the last one or two years, we also feel that, despite all the uncertainties in the post-pandemic scenario, the equations and the models seem to be working fine. So thankfully, that's the part where we feel we're getting better.
Vanita Kohli-Khandekar: Do the creative people hate you or dread you?
Shailesh Kapoor: I think they're amused by us also at times. They can't fully get it; somebody then says, but if you have so much data, why don't you make your own film? Oh God, this is like shooting the postman.
Why are you telling me this? Yeah, it's like telling a critic, why don't you make a film yourself instead of critiquing mine? But you know, the thing is that, I mean, everybody doesn't have to make a film.
I can't make a film, but there is something we specialise in. I think people have, we've earned the respect over time. So this is the thing, even if people don't like you, they understand that what you're doing is interesting and it has its own place.
And it's nice. I think a lot of times people have misinformation. Somebody will tell them, they do this, and this film they got wrong, some example.
But it's the nature of forecasting of any kind that you get one out of 10 wrong, and the nine which you got right, nobody will speak about. Like right now, we're in a phase, in terms of the other work we do, tracking and forecasting the opening of films, where Bhool Bhulaiyaa, Singham, Pushpa 2, everything has been pretty much within 2%, 3% of the forecast.
Now this is the phase where everybody's talking about, oh, Ormax's forecasts have been quite bang on. I know one film will happen where it will be off, and then they will all be on your case. Then the entire narrative changes.
So it's pretty much like it is at the box office. Your last Friday, or your last film release, decides how good you are.
Vanita Kohli-Khandekar: You live by your Fridays, I suppose.
Shailesh Kapoor: Yeah. But one has to be a little resilient about it.
But I think it's fine. One understands that for somebody who's making a film, a lot is at stake. So they don't want to be the one at the wrong end of accuracy.
Vanita Kohli-Khandekar: Fantastic. That was really, really illuminating. Thank you so much, Shailesh, for joining us.
—-
Vanita Kohli-Khandekar: The reason Ormax is becoming important to the entertainment ecosystem is simple – the market keeps growing and the competition keeps increasing. That means there is more and more at stake here – revenues, jobs and entire companies – because production is a fairly fragmented business in India. OTT generated Rs 31,000 crore in advertising and pay income in 2023; TV was about Rs 70,000 crore. Take films.. Of the Rs 20,000 crore in total revenue, over Rs 12,000 crore or two-thirds comes from the box office – the sale of tickets in theatres. It is by far the single largest source of revenue for a film.. the others are streaming and television. However, it is BO success that decides the price that streaming platforms and television firms are willing to pay for rights. How a film will fare, however, remains a mystery. It is the magic that happens between a filmmaker's vision and his audience. Content testing is the science that tries to measure how that magic might work..