Webinar Transcript: The Million Dollar Risk


This is a transcript of the webinar The Million Dollar Risk, hosted by Gene Quinn, founder and editor of IP Watchdog, which is available now on demand.


Listen while you read (recording here).

Sandy: Hello everyone. Thank you so much for joining us today. We are joined by four esteemed IP experts to discuss the topic The Million Dollar Risk: How Bad Data Leads to Bad Decisions. Gene Quinn, who is the founder and editor of IP Watchdog, is going to be our host and moderator and will lead the discussion. He's joined by Tyron Stading, who is the President and Founder of Innography and also the Chief Data Officer at CPA Global; Jaime A. Siegel, the CEO of Cerebral Assets; and Marwan Hassoun, the Founder, President, and CEO of Green Semiconductor. So without further ado, I am going to hand things over to Gene so that he can kick things off.

Gene: Thanks, Sandy, and thank you all for joining us here today for this conversation. I think it's going to be a really good, interesting conversation. This webinar has been in the works for probably six to eight weeks of planning about the discussion and what we want to talk about. So we're really excited about this. Now, a couple of things are different about this webinar compared to some of the other webinars we've done. If you've joined us before, you're probably familiar with the fact that we normally have a slide deck of some sort to go through. There's no slide deck today, so you're just going to be seeing the front slide, and that's by design, because we want an organic conversation back and forth. It's not going to be role playing, but it is based off of a hypothetical.

And we have three different people who bring three distinct levels of experience with them to this discussion. The other thing I want to say as a preface, really, is that whenever you're putting together a webinar like this, you're trying to find people who have something to contribute based on their real-world experience. I'm not going to try to put anybody in a situation where I ask them a question they can't answer, but whenever you talk about deal making and you're bringing in people who do deals and who look at confidential information, there is a risk that I'll bump up against a question they simply can't answer. Hopefully we all understand that, and if they say they just can't answer a question or can't comment, we'll move right past it, and hopefully you all understand that.

Having said that, what I'd like to do is start, as I always do with our panel, by asking each of them to give me, in 60 to 90 seconds, an overview of what they think about when they hear the topic we've put together. So let's start with Marwan, then we'll go to Tyron, and then we'll have Jaime weigh in. When you hear "bad data leads to bad decisions," what is the first thing that comes to mind, and what do you think the people listening to this webinar ought to have in mind as we approach this topic?

[Silence 00:03:27 to 00:03:36]

Gene: Hello?

Marwan: This is Marwan again.

Gene: Okay. Can you hear me?

Marwan: I can, did something happen?

Gene: Yeah, no, I just asked you: when you hear the topic, "bad data leads to bad decisions," what is it that you first think of? What should the people on this webinar have in mind?

Marwan: Okay, I apologize, something happened with my connection and I dropped off for a second.

Gene: Okay.

Marwan: Okay. So, for me, there are several things that come to mind when I hear it. Some of it revolves around incomplete data and incomplete analysis. For example, whether I think of mergers and acquisitions, due diligence, looking at a competitive landscape, or litigation issues, it's incomplete analysis of the data and the IP landscape.

So, for example, in a merger and acquisition situation, looking at perhaps only the issued patents or the pending applications, without doing a proper analysis above and beyond that. To me, that's bad data coming into the system.

Another thing that comes to mind for me when it comes to bad data, and again this has a lot to do with incomplete data, is looking only at US issued patents in doing analysis. If you think about it, some of that data is, from a competitive landscape point of view, two to three years old, because it takes that long within the US patent office, so there's more to be looked at as far as the data goes. And of course there are cases where you're just getting wrong data: when you pull in your raw data, what you have is incomplete. You have applications missing, you have patents missing. So, in general, that's what comes to my mind when I think of bad data that's used to make decisions.

Gene: Okay, thanks, and now to Tyron. As the creator of a data-centric software solution, I know you've got a lot of thoughts on data, and you have to have good data in order to put together what you're doing. When you hear about this, what are the thoughts that come to your mind?

Tyron: Sure. I think the bottom line is that no one has good data, because either the way it's entered into the systems or the patent offices themselves have created a lot of gaps or inconsistencies in the data. The first thing that comes to mind is more around ownership and change of title issues, something that everyone struggles with, and it cascades throughout: everything from normalizations, like the 1,800 misspellings of IBM, to the subsidiaries and the acquisitions that those companies have made, so that you actually know who the entity is.

Looking at chain of title issues, they can corrupt or impact the ownership aspects: everything from securitization, to chain of title issues around inventorship, to continuations-in-part not having at least one inventor in common, just a multitude of those types of problems. Then, as was mentioned about incomplete data, terminal disclaimers get really interesting, and looking at those relationships, because that has a cascading impact on things. Inventorship, the standardization of inventors, and the quality of the inventor data are a really interesting area that comes into it.

But all of that really impacts the competitive landscape, because companies make decisions on their own data, whether that's renewals or protection strategies, freedom to operate, issues around competitive landscapes or entry strategies, portfolio management decisions for renewals, prosecution analytics, M&A. It just goes on and on, but at the core of it, it's taking the raw data that the patent offices have and then compensating for gaps, issues, just bad data that comes from inventors, companies, a number of those types of things.

For example, there was a report about the latest USPTO grants and who the leaders in that space were, and that list probably has a 15 to 20% error rate, just in the list of, I think, the top 25, and that's one of the easiest jurisdictions to deal with. So making decisions or statements based on that type of data can easily lead you astray into a wrong assumption or direction.

Gene: Okay, thanks Tyron. Now Jaime, we've heard from Marwan, and Marwan, as I understand it, goes into the data, works it, puts together reports, and answers questions for clients, that sort of thing. Tyron is creating the tool set that goes into the data to create this corpus of information and make sense of it. You are really, at the end of the day, the deal maker here, on both the litigation and licensing sides, who is going to need the information in order to make the right choice before you sign for your company or your client. What does this mean to you? It sounds to me like it's a mess. How do you ever know you even have the right data?

Jaime: Well, thanks Gene. It's interesting; I think oftentimes it's less about bad data than it is about laziness in obtaining data. The data's out there. It's about, number one, collecting it, even knowing where to look for it, and thinking ahead about what data you should obtain to help you make your decisions. So I'm hesitant to blame it on the data; I'm more prone to blame it on the person who's responsible for pulling the data. To give you a comparison, in a law firm, if you're writing a legal brief, it would be like only citing cases in your brief that held against your position. No lawyer worth their salt would do that, but that's what often goes on with people's use of data.

Things like what your competitors have done historically, what judges have done historically, what the data shows in terms of what a certain venue or jurisdiction is likely to do in a situation. I come across this all the time: people using old data or no data at all, just going on supposition. So I really think that from the business perspective, as the one leading the deals or leading the projects, it's incumbent on you to direct your team to get the most current data that's available. And the data's always available; it's just about digging it out.

Gene: Yeah, okay, so now this is really interesting, because we've asked sort of the same question of all you guys and I think we got three different perspectives here. We had Marwan talking about whether the data is complete or incomplete, and that's a real concern. Then Tyron's talking about, on one level, going through and figuring out what the data is and how to justify it or normalize it, which can be really problematic, even for something as simple as a misspelling. And now what I'm hearing from you, Jaime, is that the bias of the person you're tasking with this can be problematic as well. So this is a multilayer problem that really does affect deals.

And there's one thing that we wanted to talk about here, or at least start the conversation around, and see where it goes. To those of you in the audience, if you have any questions, please send them in to us throughout the webinar. We're going to try to answer as many of the questions we get as we possibly can. But the thing we teased in the registration was the fact pattern or hypothetical that you have just recently purchased a technology company for a hundred and fifty million dollars. Everybody's happy, everybody's proud, and then shortly thereafter, you have been sued by the largest competitor in that space for willful patent infringement. Could this have been avoided? Should it have been avoided?

Now Jaime, I want to start with you. For those folks on the call who aren't familiar with Jaime, he was at Sony for many years and was in charge of managing the worldwide patent litigation and the licensing of their patent portfolios in association with that. He also did deal making with Acacia in a very similar capacity. Now he's off on his own and does consulting. So you've seen this both from the licensing deal making perspective and from this very same litigation-type issue, where you're acquiring patents, you're talking about patents, you're brokering patents. How do you know when you've got enough information, and how do you prevent this kind of disaster scenario?

Jaime: Well, it's interesting to note the last piece in there is also mergers and acquisitions, and that was also a big part of the job at Sony, so that really plays right into your hypothetical. As for when you know you have enough data: experience counts in this field. It's difficult to just read about something in a book and be comfortable enough that you're getting the data that you need, and experience comes in different forms. So, for example, in the hypothetical, you get sued for willful infringement. Well, that's something that should have come up in your due diligence review, in asking about assertions that have been made against the company.

An acquisition should include a review of what your competitors in the space are doing. Is there a history of litigation in this space? What parties are prone to doing that? And you should be trying to think about whether that impacts your deal. But this goes back to my first point, which is that the data's all there; it's just about figuring out where to find what you need. So, I don't know if that answers your question.

Gene: You know, I think it does. Tyron, let me turn to you, because when Jaime was giving that answer, the one thing that came to my mind, and I've heard you mention this before, is that you've run some analytics, and this is obviously just a first cut of analytics, but there are some telltale signs that one patent is much more likely to be litigated than another, for example. Right?

Tyron: Yeah, and as you get into different jurisdictions it has different implications, but if you look at a large enough data set and you correlate it against enough information, there are different patterns that increase the likelihood of a patent being involved in litigation. Again, you can't always account for it a hundred percent, because there are various reasons that people bring lawsuits, and you can always bring a lawsuit for pretty much anything; it's a matter of whether it actually goes forward. But the assumption is that people who bring patents into litigation believe there's some value in them, otherwise they wouldn't be doing it.

So, correlating and trying to look at why some patents are involved in litigation versus patents that are not, and doing the cohort analysis, there are different indicators you can use: everything from the number of forward citations, to the size of the organization, to the classification it's in. There are a lot of different things you can consider, and that gets boiled together and you can put it into a general probability scale. To do this, we built a machine learning algorithm that uses multivariate maximum entropy analysis to look at positional and field values, lots of different variables. We've put this on a distribution curve with a relative ranking score, and you can position the top 30% of all patents as encompassing 95+% of all actual litigation over the preceding 15-year period.

So you can probabilistically remove a good chunk of the space and eliminate a lot of the risk. That's not saying that any of those patents are not licensable or not valuable, just which ones actually end up in litigation. There's a lot of research and a lot of different pieces that go into it, and some people need to have some of the details behind it, so you can either use machine learning or you can use kind of macro formulas to do it, but there are certain correlation and causality relationships that you can derive to see which patents are likely to be involved in litigation, or which ones could be involved in litigation more often.
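
To make the kind of analysis Tyron describes more concrete, here is a minimal sketch of a multivariate maximum entropy model (logistic regression) that scores patents for litigation likelihood and then checks how much actual litigation the top 30% of the ranking captures. The features, the synthetic training data, and the cutoff are illustrative assumptions only; this is not Innography's actual model, feature set, or data.

```python
# A sketch of a litigation-likelihood model: multivariate maximum entropy
# (logistic regression) over patent features, trained on synthetic data.
# Feature choices and data below are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical per-patent features: forward citations, assignee size bucket,
# an (already encoded) classification code, and patent age in years.
X = np.column_stack([
    rng.poisson(8, n),
    rng.integers(1, 6, n),
    rng.integers(0, 40, n),
    rng.uniform(0, 20, n),
])
# Synthetic label: 1 if the patent was ever asserted in litigation.
y = (rng.random(n) < 0.03 + 0.004 * X[:, 0]).astype(int)

# Logistic regression is the standard multivariate maximum entropy classifier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, y)

# Score every patent and rank by predicted litigation probability.
scores = model.predict_proba(X)[:, 1]
ranking = np.argsort(scores)[::-1]

# What share of actual litigation falls in the top 30% of the ranking?
top = ranking[: int(0.3 * n)]
coverage = y[top].sum() / max(y.sum(), 1)
print(f"Share of litigated patents captured by the top 30%: {coverage:.1%}")
```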

Jaime: Hey Gene?

Gene: Who is that?

Jaime: Sorry, I just wanted to add on. Tyron raised a good point, and just to follow up on that, there are no certainties in the conclusion. I mean, there are certainties in the sense that you can say a patent was assigned because you've seen the assignment, but you don't know whether somebody is going to file an assignment. It's about making a recommendation or a conclusion based on the data that you have, and to me it's very different to say, "I reached the wrong conclusion," than it is to say, "Ah, I should have looked at that data." One is a failure to do sufficient work, and one is just perhaps a wrong conclusion based on the data you had.

Gene: Right, right. And this is the type of thing where, if you had done a proper due diligence analysis, you would have at least had that kind of information, right? You would have known, "Well, we're getting into this space with either this portfolio acquisition or this merger and acquisition, and these are the major players, and they are litigious or they're not litigious, and their portfolio either is the type of portfolio that normally gets asserted or it isn't." That's critical information that you have to have, right Jaime?

Jaime: Yeah, absolutely. It's kind of like a CFO coming in and making a decision on an acquisition simply based on the cash position of a company without looking at what its indebtedness was. You shouldn't be making decisions on partial information. You would never see an accountant or a CFO make a decision without examining the entire balance sheet, and people in the IP space really shouldn't be making recommendations or decisions, whether in litigation or in business, based on partial information when all the information is available; it just requires some work to get there.

Gene: Right, right. Now Marwan, I want to bring you into this conversation at this point, because when you and I were talking earlier this week, one of the things that came up was part of what you do, as I understand it, and maybe you can tell us a little more about this. When clients come to you with a research question that they need you to dig into, to get the data and get the answers for them, a big part of what you need to do is figure out: have they given you the right question for the information they really need?

And you have to go back and forth with them to figure out what they're trying to determine and why they think they need this information. That seems to me to be absolutely critical, and it came up in another webinar we did with Innography a few months ago, when Monty Wright joined us from GE. He does this internally at GE, and what he said is that the first step is to figure out what information they really need and what they're actually asking him, and then have that consultation. That's what you do as well, right?

Marwan: Yeah, that's quite correct, and I actually have so many answers for this hypothetical scenario. Both Jaime and Tyron touched on specific items there, and they both apply to what I do. For example, let's look at this hypothetical: say it's a fitness technology company, and you've been sued for infringement on 10 patents. My mind is just going everywhere. I say, 10 patents on which part of the technology?

I mean, is it the integrated circuit technology that's going into this fitness product? Is it the software they're using? Is it the point of service they're using? The categories are just infinite. So, to Jaime's point about incomplete or insufficient analysis, I would say that doing a complete analysis of every threat that's out there for a technology company is going to take a lot of time and a lot of resources to cover all these issues.

To answer your question more specifically, which has to do with whether I was asked the right question, I'll give you an example. Say, in a fear-of-litigation situation, or in an M&A situation, a client will come in and say to me, "I want you to please assess the risk from Company A against what we're doing here." And you go, "Sure, I can do that, but the threat doesn't come from just your specific competitor." The threat, as we all know from the landscape these days, comes from so many various sources, and there are companies that are not in that specific technology field that are involved in litigation.

And of course there are a lot of other companies that own patents and are asserting them. So that's one example where I go back to the client and work with them to pin down exactly what they're afraid of, in order for me to provide them with the proper analysis. I work a lot with startup companies, where perhaps there's not a lot of maturity as far as understanding IP, so by working back and forth with them, I'm able to formulate a better question and a better set of data for them.

Another example that comes to mind, and this has to do with there being no way to do the complete analysis, is that sometimes you do an analysis and there are a lot of patents out there, a lot of companies, a lot of technology, so you narrow it down to something that makes sense, given the amount of time to do the analysis. Then six months later, these 10 patents get asserted, and you look at them and go, "I can't believe they're asserting these patents."

But I think either Tyron or Jaime mentioned something like this: anybody can sue anybody, really, and can assert any patents against anybody without necessarily understanding whether the company truly infringes or not. Those will come out anyway, even though the analysis had eliminated them. So, would somebody like me have been able to find those threats? The answer is probably yes, given a heck of a lot more time. I'm not sure I pinned anything down here, but hopefully I've generally explained the parameters one has to deal with, whether in a litigation or an acquisition situation, when assessing litigation risk.

Gene: Well, you did, and you raised what I think is a really important issue and one we're going to want to talk about. And, you know, at the risk of asking a question I don't know the answer to, which is something you're never really supposed to do: I know that when people come to me and ask about doing searches, whether it's a search ahead of a patent application or a search because they're starting to think about doing some research and development in an area and want to get a handle on the space and see what's out there, for those kinds of search and research projects I know what's reasonable. I know how to get to the 80% solution or confidence level, which you can do relatively quickly, and how much it's going to take if they want to go higher than that, for every percentage more.

It takes a lot more money and a lot more time. But Jaime, let's start with you, and I want to get Tyron and Marwan in on this too. Number one, how much confidence is enough confidence? Which I know is just an impossible question, so I want to discuss that. And then, two, what kind of investment should you be making to get to a reasonable enough level of certainty when you're talking about, one, acquiring a portfolio; or two, a merger and acquisition; or three, saying, we're about ready to settle this case and I want to make sure that I'm going to settle everything that this company can possibly throw at me?

Jaime: Sure. Well, before I dive into those questions, I just want to comment on Marwan's comment that anybody can assert any patent against anybody.

Gene: Sure, go ahead.

Jaime: You know, one of the things that I take a lot of pride in as an IP lawyer is that I do think our bar is one of the best bars in the world, with super bright people who, more often than not, have a very high level of ethics, despite what the press may say. And there are certainly protections built into the system, Rule 11 being the federal protection against frivolous lawsuits. So I take issue with the comment; yes, anybody can assert anything against anybody, but in practice, I've literally managed hundreds of patent litigations over my career and I've seen a frivolous lawsuit once, and that case went away.

So, while the media likes to portray this horrendous world of patents where horrible patents are being asserted, in reality it's about reaching the correct valuation on what a good business deal should be, rather than a frivolous lawsuit. You really are hard pressed to find more than just a few examples of frivolous lawsuits in the last couple of decades. So I just wanted to clarify that.

Marwan: Can I comment back here for a second Gene?

Gene: Yeah, yeah, go ahead.

Marwan: So, I apologize if the statement came across as generic. I was actually talking about a very specific technology. It actually looked like a legitimate lawsuit coming in, but when you look at the patents that are involved, it's underlying technology that's not being used. So I wasn't talking about the generic notion of just any patent being asserted; I wasn't talking about a frivolous lawsuit. If...

Jaime/Tyron: Or the whole patent troll type issue, right?

Marwan: Yeah.

Jaime/Tyron: I mean

Marwan: I was more talking about a very specific technology that, say in the hypothetical situation, the company doesn't really use, and when I'm doing the analysis, for example, I've eliminated those patents because I knew we don't use them. But from the outside world, it doesn't look like we don't use them, and that lawsuit will come in, and from the outside it isn't frivolous, but from the inside it looks that way because we don't do it. Yet you still have to go through the motions of defending it. So I just wanted to clarify my comment.

Jaime: Sure, sure, and actually, you know what, Marwan? That plays right into Gene's question and what my response was going to be, in the context of, let's say, a product clearance letter. That's a really good example that covers a lot of your points, Gene, and it goes right to the issues you raised, Marwan. In my past, hypothetically, there might have been a major product study: you have a new product coming out for a client and they want to get a patent clearance study.

My personal view is that, frankly, they're a waste of time, because you're never going to get absolute certainty as to whether or not your product infringes someone else's patent. In fact, it's almost certain that every product out there is going to infringe someone's patent somewhere, and whether you're going to find every risk is doubtful. You're never going to get to a hundred percent confidence, so what's the point of getting even to 99% confidence? Because that 1% may still be the one that comes after you. So what's the real goal behind the question your business leader is asking when they want this clearance study?

Is it to make them feel warm and fuzzy that things will be okay, so they can use your data to justify a product launch to somebody higher up in your organization? Or is it that they really believe they're safe because of it? If it's the latter, that's a mistake, because you're never really safe. So in terms of confidence, how much? Well, it depends on the context. In that context, if it's the latter, about the comfort, there's never going to be a high enough confidence for you to legitimately feel confident that you're not going to be sued. If it's just to provide a warm fuzzy, well, maybe, I don't know, something more than 50% that says it's more likely than not, because that's really as close as you want to get: that it's more likely than not that this shouldn't happen.

And even when you're making litigation strategies, you can look at what a judge has done historically. I know that in the Eastern District of Texas, the judges are, with a very high degree of certainty, going to set a Rule 16 conference within a certain period after the filing of a lawsuit, without regard to whether or not an answer has ever been filed, and I can make strategic decisions based on that. Similarly, on an investment, it really comes down to the value that you're getting out. A million dollar investment to provide some comfort in a billion dollar deal? Well, that's okay. A million dollar investment to look into validity issues for a patent that you could license for $50,000? That's not okay; that's not a good business decision.

So I think it's very hard to say how much investment is enough. What you really need to do, and in fact I had this discussion with my class at UC Irvine Law School this week, is always make good business decisions, and good business decisions are different from good legal decisions. You should make good business decisions based on good legal data and good legal advice, but ultimately you're going to have to make a good business decision.

Gene: Yeah, can you elaborate on that a little bit more? Because that's, I think, a critically important point. A lot of times people come to me and say, "I want a hundred percent certainty that I'm not gonna infringe," and I tell them, "Look, if that's what you want then you've got to go do something else, because nobody can possibly ever give that to you. It's just impossible to say that you're never gonna get sued. That's not the way it works. And on top of that, if you start today, there's no guarantee that there isn't something in process right now, hidden under the 18-month secrecy requirements before publication, that will pop out at the other end, and we wouldn't want to run into some kind of a problem there." It's got to be about whatever your personal threshold is, I think, in terms of pulling the trigger to make the business deal, whatever the deal is; you have to achieve that level of confidence. That's what you're talking about, right?

Jaime: Yeah, yeah. I think it's more about clarifying. I personally always use a window of 20% to 80%, so there's no certainty above 80 or below 20 in my mind, based on experience. Because you just don't know what can happen: you could be building an entire program based on a patented technology, and in the next 24 months Congress may decide, "Well, you're not entitled to that. That patent doesn't exist anymore." And then you're powerless to do anything about it. That falls into that above-80, below-20 window.

It's just about getting to your comfort level in terms of risk windows. So, whether you're advising your client or you're the one driving the deal, if you do a survey of the industry and see that there are a dozen competitors in the space and 10 of them have engaged in patent litigation on a regular basis, you can conclude that there's a high likelihood you're going to get sucked into patent litigation absent a cross license with all those competitors. That's kind of a clear cut example: high risk.

On the other hand, if you're looking at the industry and you see there's never been a patent litigation by any of the competitors, ever, you could say there's low risk of a patent litigation. There's no certainty, but there's low risk. So I think it's far more prudent to look at things in high risk, low risk, and "we just don't know" kinds of categories.

Gene: And let me just follow up one more time. In that last scenario, if there were a patent litigation, it would likely be from a smaller entity who believes they've innovated something that you're using, rather than a competitor patent litigation, which would also be something you'd want to factor in, I would suppose.

Jaime: Yeah, absolutely. Look, if it's an active NPE space, I mean if an NPE has come in and sued all the competitors in the space, then you should assume that if you get into that space you're going to get sued too. But that also speaks to how you lead the business strategy forward on this. I think it's always better to negotiate a deal when you have less exposure rather than more exposure. So it goes back to having the right data in order to drive a deal. I think part of that decision and strategy is to proactively go out and resolve issues that you think are high risk, before you have the exposure. Use that to your advantage; use the unknown to your benefit to negotiate a deal.

Gene: Yeah. Now Tyron

Tyron: This is

Gene: I want to bring you

Tyron: Yeah, I was just going to add a couple of comments, if you don't mind.

Gene: Yeah, no, go ahead, because I have a question for you too if you don't hit on it. So, your thoughts first, please.

Tyron: Sure, so I do agree with the 80/20 rule. I think you can do 20% of the work and address 80% of the space, but I do want to separate things out in kind of an M&A activity, where there are a lot of these business strategies. There are two different types of activities you want to look at: one is the company's own portfolio, and two is other companies' portfolios as they relate to your company or this acquired portfolio or set of products. And there are relatively, well, not simple things, but key things that I think have a huge return. For example, looking at chain of title issues, terminal disclaimer relationships, those types of things about perfecting the assets, to make sure that you actually are getting what you have.

So, if you're actually going to have a litigious strategy, or you're going into a competitive space and you need to be able to use this portfolio in the future to protect your interests, you want to make sure that you're actually getting those assets, and quite often I'm surprised that companies aren't confident about what they actually own, let alone the legal status issues. If they go to court and it gets thrown out with prejudice because they didn't correct the chain of title or something like that, well, those are the types of things that I think are relatively cheap to go and fix but are very important. It's not very expensive, it's a fraction of the size of the deal you'd be doing, but I think that's something that is important for portfolio due diligence, in the first category.

On the second aspect, I do agree that you're not going to be able to get to a hundred percent on it, and I think you can look at the probabilities of what is actually going to be litigated or not. But one tactic that you can also think about from a cost perspective, and I enjoy thinking about this, is negotiating during the acquisition process to use this type of information to put a contingency or holdback in escrow in case there's some kind of litigious activity. You can roughly estimate what that damage or that risk could be, and then be able to say, "What's the escrow clause?" or "What's the time period that would kick into that?" Because that mitigates some of the risk that goes into it as well. So those were just my follow-on comments.

Gene: Okay, alright. Now, the question that I wanted to ask you, and you did touch on it a little bit here, is this: I know you've put together a lot of these tools and you're constantly looking at data and how you can get information out of the data. I believe your philosophy, and maybe you can touch on this a little too, is that there's all this data out there, and with the tools it's about trying to reduce the amount of time by pushing the relevant information forward. Rather than there being an entire field of haystacks and you've got to look through each entire haystack separately, you're kind of trying to say, "No, no, your answer is really going to be here in this haystack, look over here." So, when you're looking at data for this kind of deal making purpose we're talking about, which has essentially been due diligence for a variety of different reasons, what data goes into that particular haystack that people should be looking at?

Tyron: Sorry, there are actually quite a few different data sources that you need to be really complete on this. Just for example, we have a customer that spent nine months with an outside law firm going through the patents to look at terminal disclaimer relationships, because it's not something that's necessarily provided by the patent office. You actually have to read the image file wrappers, go to PAIR, look at the front of the patent. You have to look at about three or four different sources to get a complete view of what the terminal disclaimer is, and then map those relationships, and then cascade those relationships in some cases because of how they're tied together.

And that was extremely expensive, but obviously you can't miss that data. So just in that small example, even though it's isolated to US grants or US applications, you're looking at at least three or four different sources to get that information. Again, the PAIR information, the front of the patent; there are inconsistencies, and you have to look at the data sources they bring into it. So it's quite difficult, even for something as conceptually simple as a terminal disclaimer: in practice, how do you merge those data sets together?
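
As a rough illustration of the merge problem Tyron describes, here is a minimal sketch that reconciles terminal disclaimer evidence for one patent from several sources. The source names, record layouts, and patent numbers are hypothetical assumptions, not actual PAIR or file wrapper formats.

```python
# A sketch of merging terminal disclaimer evidence for one patent from several
# sources. Source names, field names, and patent numbers are hypothetical.
from collections import defaultdict

front_page = {"9,000,001": {"terminal_disclaimer_filed": None}}   # often silent
pair_status = {"9,000,001": {"terminal_disclaimer_filed": True}}
file_wrapper = {"9,000,001": {"disclaimer_reference": "8,500,123"}}

def merge_sources(*sources):
    """Combine per-patent fields; later sources fill gaps, never overwrite."""
    merged = defaultdict(dict)
    for source in sources:
        for patent_no, fields in source.items():
            for key, value in fields.items():
                if value is not None and merged[patent_no].get(key) is None:
                    merged[patent_no][key] = value
    return dict(merged)

records = merge_sources(front_page, pair_status, file_wrapper)
print(records["9,000,001"])
# {'terminal_disclaimer_filed': True, 'disclaimer_reference': '8,500,123'}
```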

Now let's take chain of title and ownership issues. You have interdependencies between family members that you not only have to look at, you have to build into a tree, so to speak. And thinking through all the assignments: was there a transfer from the inventor to the company? Were there any securitization issues that came back? Were there reassignments that transferred over? Were there prosecution events that could have encumbered any of these issues? There are a number of these different things, and depending on how specific you need to be, obviously UCC provisions get quite specific on how you perfect those assets, but often the diligence that goes into the US assets alone isn't even there.

But then you get into continuations-in-part, as I've mentioned, and you get into international assets, where you have issues around taking actions under a company name that's not on the priority document, and that gets you into different troubles; that was a ruling in 2010. So there are quite a few of those pieces, but even just the chain of title aspects, I think, take about 10 to 12 different data sources to piece together, and it's little bits and pieces from each of those sources, but it's quite a challenge.
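
To illustrate the chain-of-title reconstruction Tyron outlines, here is a minimal sketch that orders recorded assignment events for a single patent and flags any break where an assignor does not match the prior assignee. The record format and the example transfers are hypothetical assumptions.

```python
# A sketch of rebuilding a chain of title from recorded assignment events and
# flagging breaks in the chain. The record format and example are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class Assignment:
    recorded: date
    assignor: str
    assignee: str

def chain_of_title(assignments):
    """Return the date-ordered chain and any assignor/assignee mismatches."""
    chain, gaps = [], []
    for a in sorted(assignments, key=lambda a: a.recorded):
        if chain and a.assignor != chain[-1].assignee:
            gaps.append((chain[-1].assignee, a.assignor))
        chain.append(a)
    return chain, gaps

# The second recorded transfer does not line up with the first assignee,
# which is the kind of defect that can sink an assertion down the road.
events = [
    Assignment(date(2011, 3, 1), "Jane Inventor", "Startup LLC"),
    Assignment(date(2015, 7, 9), "Startup Holdings LLC", "BigCo Inc."),
]
chain, gaps = chain_of_title(events)
print("Current owner of record:", chain[-1].assignee)
print("Breaks in the chain:", gaps)
```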

We did a presentation once where we were talking about the RFID space, and we highlighted some activities. In the assignments you also have these intermediaries that deals go through to try to hide where the assets end up, because a lot of times people don't want you to be able to see the entire chain. So we highlighted that this company, using an intermediary, had actually divested a set of its portfolio and given it to this other company. And that company came up later and was kind of frustrated that we had discussed this, and we pointed out that this is all public information and we didn't understand why there was a problem.

And the person said that there's a huge difference between publicly available and publicly known. I think that underlies the philosophy that there's lots of data out there, and just knowing where to look, how to piece it together, and how it comes together to tell the story is a massive advantage if you're able to put it together, but there are so many pieces and so many ways you have to weave it together. I think that's really the core of this bad data issue: it encapsulates not only the bad quality of the data but how it gets pieced together, and how you look for certain issues in parts of the transaction.

Gene: Yeah, yeah. And that gets back to what Marwan started us off with: it's the incomplete nature of certain data sets that can be really problematic. I have a question here that's come in that I'd like to get to; it's probably most directed at you, Jaime, but if either of you other guys have any thoughts on this, I'd like to hear them as well. Just to give you the context of who asked the question, without divulging too much, it does come from a university, so this is somebody that researches and develops and owns a portfolio that's been licensed out. They asked, "Would any of the fault for the infringement in the hypothetical we're dealing with be on the previous owner of the patent, or is it only on the current owner of the patent at the time of the lawsuit?"

Jaime: That's an interesting

Gene: That's a big issue for folks like that, the research and development organizations that are coming up with this and then selling it, or what have you.

Jaime: Yeah, it's an interesting question, and I think the only part that puts big risk on the seller is the willful infringement part. If you were to have a patent assertion against you, or if you've received letters from someone making threats regarding a potential infringement, I think it's incumbent on you to disclose that as a potential risk or liability to the purchaser. It's not unlike selling a house with a sinkhole beneath it and putting a rug over it so that the buyer can't see it. You can't simply hide risks; that would be bad faith in dealing.

Gene: Yeah, and I also think so much of it will also depend on the contract itself that's entered into between the parties, isn't that right? I mean, who accepts the risk moving forward?

Jaime: Well, I think even on an as-is sale, if you have an explicit threat regarding patent infringement, and the only people who are going to know about that are you and the party that made the assertion, I still think it's bad faith that puts you at risk if you don't disclose it. I don't think you necessarily have to disclose everything you've thought about regarding potential parties out there, because those aren't real risks, and it's incumbent on the purchaser to do their own analysis. But if you have a real threat and a real risk, even in an as-is sale, you need to disclose that.

Gene: Yeah, yeah, I would agree with that. Marwan, I want to

Tyron: You know, one interesting thing, sorry...

Gene: Go ahead.

Tyron: Sorry, I just want to make a quick comment. With markings and being able to put people on notice, one of the interesting things, and I don't know if anyone has come across this, is that if you post patent-to-product associations, you're putting people on notice even though they haven't necessarily received a letter, and in theory you can claim back damages. So even if they haven't received any kind of letter, by putting up this marking notice, theoretically you're able to claim back damages on your side. So it's interesting, in this transaction, whether that would be applicable or not for some of the back damages from the previous company.

Gene: Now Marwan, the issue we're bumping into here at the moment I suppose is

Jaime: Gene, can I comment? I want to comment on the risk assessment, if I may. A quick comment.

Gene: Yeah, sure, and that's where I really wanted to go with this. So let me just quickly get this out and then you can wrap this whole thing together, because it strikes me that what we're talking about here at the moment is maybe not getting the best information from the seller, the person who is selling their company or selling their portfolio. And for you, as the person who's trying to provide the information back, that's got to be a real problem, I would assume, in this whole process.

Marwan: Yes it is, and this goes back again to understanding what the client is trying to use the data for. If I give them the wrong data, it might be correct data, but to me it's still bad data, and they make a decision using it because I didn't understand what analysis they were asking for. And to both Jaime's and Tyron's point, it's all risk assessment; there's no way you're going to cover the entire landscape. I like the 80/20 rule they discussed. I never consciously put it in those terms, but it is about right. And getting good data, making sure there's good data coming forth, also comes down to how you do the analysis.

And again, I'm on the technology side of all this. I'm not an attorney, and I'm not a designer of a tool like Tyron is; I'm a user of that tool. So the first thing I do with a question, if you will, or an analysis, is actually turn a blind eye to many of the things that Tyron mentioned, the chain of ownership and all of that, and look strictly at the technology itself. Who owns it? How is it owned? I try to filter that into the first answer to the question. Then from there, the different layers of analysis can be put on top of it, but to me it's very, very essential for a technology company, a technology acquisition, to actually understand the technology landscape.

And I don't mean to put in a plug for Innography, but I will, because of the existence of tools like Innography, and I've been using it for about seven years now, I'm able to do a heck of a lot more analysis on, for example, the entire USPTO database, and actually filter it down to the data I want to satisfy some of the requirements that somebody like Jaime has in order to make the decisions. So I think we can do a lot more analysis in this day and age because of where we're at technology-wise.

Gene: Okay, we have another question that's come in, guys, and it specifically asks us to talk about Alice. The question is, "The Alice v. CLS Bank decision has caused a lot of confusion with software patents, which still hasn't been resolved," which is probably an understatement. "Based on that uncertainty and the validity of software patents generally speaking, how much due diligence, if any, is appropriate when you're considering acquiring software companies?" Or, I'll throw in there, maybe acquiring software portfolios? Does anybody have any thoughts on that?

Jaime: Yeah, this is Jaime. It's good timing; funny enough, we talked about Alice and software patents in the law school class this week, specifically. The Federal Circuit has issued some decisions that give far more hope to software patents, including Enfish. Tied to the question, you could definitely do due diligence on a patent portfolio for a software company in terms of reaching at least a warm fuzzy. You're not going to get certainties, but a warm fuzzy as to whether or not they have patents that are likely to withstand an Alice challenge.

The caveat is that it's incredibly dependent on what panel ultimately decides it at the Federal Circuit. Enfish is a great example; I think some panels would have found the exact opposite, because they are just so anti-software patents that they never would have found it to meet the Alice requirement. But with software companies, a more interesting issue is: has the company made a decision to put all its eggs into the patent protection basket, or is it relying on copyright and trade secret? Which, frankly, are often more valuable for a software company, because it depends on what stage their product is at. Do they have compiled source code that they could protect by trade secret? Do they have copyrights? Or is their invention less the code than the application of that code? So it really depends on what the software company's bread and butter is.

Gene: So, the typical lawyer answer: it depends. And I guess I even have to totally agree with that. Looking at all this, I think you're certainly better off on the patents here today under Alice, although it's still very uncertain. If you look, I think there are 11 judges on the Federal Circuit who would, in the right situation, vote for claims being patent eligible. That leaves a handful of them who would vote no, and certainly Judge Dyk and Judge Mayer are absolute no votes. So if you draw a panel that has both of them on your case and you're hoping to say that you have something that is patent eligible, you have lost. And the PTAB is the PTAB; they don't seem to like patents very much, particularly the CBM stuff, which is just mind boggling to me. That's another webinar for another day. So there is a lot of uncertainty, and I think you really have to look at each one of these deals carefully, to the extent that you're not getting a sweetheart deal.

You know what I mean? Some of these portfolios are being sold for pennies on the dollar. Some of these patents are up for sale for substantially less than the cost of acquisition, even just the prosecution costs. So maybe you're taking a flier in that case, and if that's the case, then going back to what Jaime was talking about earlier, if it's a billion dollar deal, a million dollars to assure yourself makes sense, but if it's a $50,000 deal, you're not going to spend a million dollars. So if you're taking a flier and you're getting a sweetheart deal, then you're probably not doing a whole lot of due diligence. Maybe you're just doing a quick look, maybe you're even just weighing the patent, flipping through it to see whether it has any kind of technical description at all that you think maybe could stand the day. I don't know.

But I'm looking at the clock, and I just noticed this conversation's going so smoothly that I've lost track of time. We're down to the final two minutes, so let me call an audible here, because I want to give each of you guys the opportunity to wrap this up. Let's start with Marwan, then we'll go to Tyron and Jaime, in order. What I'd like you to do, in sort of a rapid-fire 30 to 60 seconds, is tell us: what one thing do you want the audience to leave with today? What piece of information would you like them to leave with about the topic we're discussing, in terms of bad data, bad decisions? Marwan.

Marwan: I think the main thing is that the bad data can be mitigated in so many ways. Echoing some of what Jaime said, the data is out there; it's a matter of using the tools and doing the proper analysis to get to it. But at what point do you stop? That's where you come up with some kind of confidence ratio. A lot of the bad data that is being used for these decisions can be mitigated by proper usage of tools and analysis.

Gene: Thanks Marwan. Tyron.

Tyron: I think the takeaway for me is that not all data is created equal, and so when you look at things like company normalizations, or you look at some of the stitching of this data together, it's easy to have a check box, but there are so many details behind it and so many depth, quality, and legal questions that go into it. Just make sure that you actually understand what's going on and put the diligence into the confidence of how that's being approached, because, you know, a car is not just a car; there's a huge range of different things that go into it. So make sure that you have confidence in how the data is processed, because that obviously impacts your analytics and decisions and the conclusions for what you do with it.

Gene: Yeah, if you're building a house on a faulty foundation, the house is never going to be worth anything. Jaime, your final thoughts.

Jaime: Alright, three words, that's it. Don't be lazy.

Gene: [Laughs], oh I love you, Jaime.

[Laughter]

Jaime: But it's true, it's true. The data's there. That's really the point that everybody made: the data's out there, so don't be lazy. Get enough data that, in your professional experience, gives you enough confidence to be able to give advice to your clients, and don't ever give certainties.

Gene: Yeah. Alright, well, thank you panel, and thank you all for attending. Certainly, if you are interested in data solutions, I invite you to check out Innography; it's a wonderful tool. We use it at IP Watchdog, we get a lot out of it, and I think you'd get a lot out of it too. I personally think it sells itself if you try it. So, if you're in the market for it, give them a call, and thank you. We've got a few more webinars already in the works over the next few months, with some great topics of discussion, so we look forward to you joining us again in the near future. Thanks everyone, and have a great day.

[01:01:41] [END OF AUDIO]
