Nasir Pasha & Matt Staub

Can Companies Protect Against Foreseeable Misuse of Apps [e285]

Nasir and Matt discuss the suit against Apple that resulted from a car crash caused by the use of FaceTime while driving. They also discuss how foreseeable misuse of apps can increase liability for companies.

Transcript:

NASIR: Hi and welcome to Legally Sound Smart Business!
I’m Nasir Pasha.
MATT: And I’m Matt Staub. Two attorneys here with Pasha Law – offices in California, Illinois, New York, and Texas.
NASIR: Welcome back to the podcast after a short break between our last episode and this one. This is where we discuss current business news with a legal twist.
Today, we are talking about how developers – especially mobile app developers and other business owners in that space – can deal with possible liability for their software apps and their products. In this case, it’s a sad story of an Apple app that actually caused – well, is alleged to have caused – a deadly accident.
MATT: Yeah, let’s do a little bit of the background here on how we first started discussing this.
There was a car accident back – actually, on Christmas Eve in 2014 – in Texas. The reason this is just kind of surfacing now is they just filed the lawsuit, two years later. I assume that’s because of the statute of limitations – they had to get it in in time. But there was a car accident with a driver – a 20-year-old – who was on his way to visit family on Christmas Eve. He was – for whatever reason – using FaceTime while driving. For those of you who aren’t Apple or iPhone users, FaceTime is basically a video call. You’re able to see the other person and vice versa while you’re on a phone call. Kind of like Skype, I suppose. He was driving on the highway and wasn’t necessarily paying full attention. He crashed into the car of this family and – you kind of alluded to this – the sad story behind this is it resulted in the death of the family’s 5-year-old daughter who was in the backseat. There were actually four people in the car – the parents and the two kids. I believe the other three were injured and then, of course, the unfortunate death of the 5-year-old. That’s kind of the backstory of what happened here.
So, what has happened since: the family is now suing Apple, essentially alleging that Apple was negligent in not having any sort of safeguards that would restrict the use of FaceTime while driving.
When I first heard that statement, I thought, “Well, I don’t really know about that,” but the interesting thing is Apple actually had this technology as early as 2008, when they filed for a patent on it. They call it lockout technology. It would basically lock the driver out of certain apps while operating the vehicle. Now, with that knowledge – I didn’t know if that was something you had come across or knew about – but, when I found that out, I was like, “Oh, well, that makes this case a lot more interesting.”
NASIR: But I have an inherent problem with the argument because, you know, just because you filed a patent on an invention – and this is kind of off-topic – commercialization of an invention is a lot different than actually inventing it in itself. Like, I can invent a machine that travels through time but, because it costs a billion dollars to make, I haven’t been able to actually build it yet. And so, I’m looking for funding today.
The point is, you know, they can still file a patent on the invention, but that doesn’t mean they can actually implement the software. For a practical example, maybe they had the technology but the processor speed of the phone isn’t fast enough to quickly detect whether you’re moving, or whatever the reason. By itself, even assuming that argument is valid, just because they have a patent doesn’t mean it’s a viable option to protect the user.
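The lockout concept being debated here can be sketched as a simple speed-gated check. This is a purely hypothetical illustration – the function name, speed threshold, and app list are all invented for the example and are not taken from Apple’s patent:

```python
# Hypothetical sketch of a "lockout" safeguard: block certain apps when
# motion data suggests the phone is in a moving vehicle. All names and
# thresholds are invented for illustration.

DRIVING_SPEED_MPH = 15.0  # assumed threshold: above this, treat the user as possibly driving

def should_lock_out(speed_mph: float, app_name: str,
                    restricted_apps=("FaceTime", "Messages")) -> bool:
    """Return True if the app should be blocked at the current speed."""
    return speed_mph >= DRIVING_SPEED_MPH and app_name in restricted_apps

# The practical problem Nasir raises: speed alone can't distinguish a
# driver from a passenger (or a bus or train rider), so a hard lockout
# over-blocks, while anything softer is easy to bypass.
```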
MATT: Well, yeah, with your invention, you could essentially just be sued for whatever bad thing happens then because why didn’t you just employ your time machine? You could have prevented it. You could have stopped these bad things.
NASIR: What’s nice is that if I do get sued, I can always go back in time and prevent that from happening.
MATT: That’s true. You probably have been sued a ton and we just don’t know about it. That’s a good point.
NASIR: Yeah, I’ve changed the future.
MATT: You make a good point. Just because they filed for this patent back in 2008 doesn’t necessarily mean they could have had it in the phones today in a usable form. I’m not any sort of software engineer, but it seems like there might be some difficulties in determining whether the person using the phone is the one driving the vehicle. Who’s to say it’s not somebody in the passenger seat or behind them? Something like that.
NASIR: That brings up a good example – Waze – which we may have talked about in the past because Waze and Google had some data exchange and so forth. Waze is like a Google Maps app but it’s specifically designed for getting from Point A to Point B. It’s like the navigation aspect of Google Maps. Again, we’ve talked about it, but one of the cool things is it crowdsources information – traffic, et cetera. But, when you open the app and try to type in a destination, the first thing it asks you is, “Are you a passenger or are you a driver?” If you’re a driver, then you really shouldn’t be using this app, and you can’t use it unless you confirm you’re a passenger or you’re stopped. It actually has a preventative measure. It can’t detect whether you’re a passenger or not, but it does detect that you’re moving.
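The Waze-style approach sidesteps the detection problem entirely: detect motion, then let the user self-certify as a passenger. A minimal hypothetical sketch – the function name and logic are assumptions for illustration, not Waze’s actual implementation:

```python
# Sketch of a self-certification gate like the one described for Waze:
# the app doesn't try to prove who is driving; it only detects motion
# and asks the user to confirm they are a passenger before allowing input.

def allow_text_entry(is_moving: bool, claims_passenger: bool) -> bool:
    """Typing is allowed when stopped, or after the user taps 'I'm a passenger'."""
    if not is_moving:
        return True          # stopped: no restriction
    return claims_passenger  # moving: only after passenger confirmation
```

The trade-off is the one Matt raises next: like an age gate on a brewery website, it relies on the user answering honestly.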
MATT: It’s like when you go to a website for a brewery – before they let you see any information, they ask you whether you’re 21 or not.
I have a pretty good example. With my car, there are certain things I can and can’t do on the little console up front while I’m driving. It does give me a warning. If I try to enter directions while driving, it pops up with a warning saying, “You shouldn’t be doing this.” I just hit OK and I’m allowed to do it. There are certain things that it does lock me out of. Actually, entering directions is probably the most dangerous thing I could be doing on there. I don’t have TV or anything like that, so it probably is the most dangerous thing, and I think this comes up in the case too. Where do you draw the line with that? How do you even know?
In this case, with this 20-year-old, how would they even have known he was driving? Just the idea of using FaceTime while driving seems a little bit questionable. It’s almost not even believable.
NASIR: Let’s go back to the lawsuit. There’s kind of an implication that everyone’s drawing – okay, “Wait a minute, FaceTime? Everyone realizes that you shouldn’t be using your phone and there are inherent risks.” But I just want to note – and you mentioned this in the intro – that they actually waited a couple of years before filing. In fact, they filed on December 23rd, and the accident happened on December 24, 2014, most likely because the personal injury and products liability statute of limitations is two years. I only want to note that because it could give us an indication that it’s not like these guys immediately took to a lawsuit as their first reaction. If you start thinking about it, there’s a viable argument. Obviously, they lost a child, which is horrific, of course, and I’m sure they’re willing to take the criticism that, “It’s your fault. You had something to do with it. You were using your phone while driving.” But that’s not their point. Their point is not that phones are inherently dangerous and should have all these protections. It’s that, because of the nature of phones and because the technology was available, there should have been something Apple could do to prevent these accidents from occurring.
MATT: Yeah, and that’s really – like we talked about earlier – that’s really what it boils down to. Where should the line be drawn on what Apple’s responsible for here? Even in the best-case example, where Apple has this technology, it works flawlessly, and it’s relatively cheap, should that be something they’re required to have? I mean, people are going to make their own decisions.
I’m trying to think of another example. It’s not like, if you bought a gun and pointed it at somebody, a lockout would kick in and something would pop up saying, “You can’t shoot somebody. That’s murder.” That’s a bad example but…
NASIR: It is a bad example but still funny.
What’s interesting though – what we can take from what you just said, which was, again, very funny – is that there’s actually already a whole area of law that is applicable to this situation. People may think, “Oh, this is a new area of law where apps all of a sudden cause accidents and where do you draw the line?” But, simply, it’s products liability, and that’s actually how they sued. You know, a gun is a good example, but let’s think of a tool – any kind of tool, like a chainsaw. Most chainsaws are designed so that you pull the handle and it starts going, but, if you let go, it’ll stop, in case the chainsaw somehow gets loose. That’s a safety feature. Or there will be a guard between your hand and the actual chain. These are protective measures.
Now, if there wasn’t a guard and you actually cut yourself because of it and had some kind of dangerous accident, the argument would be, “Hey, the technology of having a guard here was available. Why didn’t you put it on here?”
It’s the same kind of legal theory that we can analyze as it applies to this app and any other app for that matter.
MATT: Just to kind of flesh out the rest of the lawsuit – because we’re going to touch on what you just discussed – there’s negligence and things like IIED (intentional infliction of emotional distress) and the like. In this case, we have a product – this iPhone – and what I think they’re arguing is the design was inherently dangerous in that it allowed this individual to use this dangerous thing while operating a vehicle when it shouldn’t have. This is obviously not a clear-cut situation. Your chainsaw example makes a lot more sense, and that’s something someone can probably decide on pretty easily. But, in this case, I don’t know. I’m personally just having trouble fully agreeing with that. I can a little bit, and I think, unfortunately, the death probably factors into some of it, which it shouldn’t.
NASIR: Yeah, anyone and everyone can sympathize, and it is difficult to separate that sympathy. But if you want to look at it objectively, this is how the court is going to analyze it and how they should analyze it. Products liability cases in most states are reviewed from a strict liability perspective – meaning all the plaintiff has to show is that there’s either a design defect, a manufacturing defect, or an inadequate warning. Now, each of those categories requires its own kind of test. Let’s just go one by one really quickly here. The design defect is probably what they’re going for here. What they’re saying is that this phone is inherently dangerous on the road. And so, by design, it is inherently dangerous, and the design itself needs to be corrected. That’s their argument. We’ll come back to that in a second.
The second category is manufacturing defect. They’re not necessarily arguing that here. They’re not saying that the particular phone at issue malfunctioned, because it actually did exactly what it was designed to do. If the phone had exploded like the Samsung Edges, that would be a manufacturing defect, because they’re not designed to explode but they exploded and caused an accident.
The last one is inadequate warnings. This is about whether the danger that led to the accident is an obvious one. Take the gun example – they don’t need to put a warning that a gun will shoot a bullet out and hurt whatever gets in front of it. That’s obvious. But is it an unpredictable result? Is it unpredictable that, somehow, when you’re driving and someone calls you on FaceTime, it may distract you and cause an accident? That may be what they’re going for too because, in theory, Apple could have said, you know, “There’s a warning here. Be careful when you’re using this on the road.” It may not be as obvious. And so, we can come back to that. What do you think?
MATT: Yeah. I mean, we can talk about the manufacturing defect – I don’t think that’s going to come into play. Inadequate warning, you know, is a better argument, but I don’t think it’s perfect by any means. I think what they’re really going to have to rely on is the design defect aspect of it.
NASIR: Yeah, because most people know that any kind of distraction on the road is dangerous. Even talking to somebody next to you in the car takes your focus off the road, you know?
MATT: Yeah, and one thing that I didn’t even really consider – I don’t have the complaint right in front of me at this second, unfortunately – I know they didn’t sue the driver, but do you know whether he was holding the phone or was it in one of those mounts on the windshield? I don’t know if it said that or not.
NASIR: Let me see if I can find it and I’ll answer that question. But, of course, Texas doesn’t have the hands-free phone laws that California does.
MATT: Yeah, no one follows them here. I don’t know why. Ten years later.
NASIR: Well, now, California just changed its law to make it even stricter, right?
MATT: And this is with more cars having Bluetooth built in anyway. But this isn’t a lawsuit against this individual – it’s against Apple. I was just trying to get an idea. It kind of factors into the analysis, I think.
NASIR: And so, the reason I think they’re really focusing on the design defect is because of how they talk about it. In fact, they go through a whole section titled “Smartphone Compulsion/Addiction” and kind of make the argument that, “Look, phones are so addicting and everyone’s so attached to them that, even if you put a warning on it, that’s not enough.” You have to design the phone so that it’s safe – so that, if the user gets a call or a FaceTime call – because a FaceTime call, with its video interaction, is much more distracting than a regular phone call, which I think is an obvious and fair argument – the phone needed to prevent that somehow.
MATT: We’ll link a copy of the complaint because it’s very artfully drafted, as you said. Yeah, it’s essentially saying, “Look, so many people don’t even have the will to not use it. They can’t even control themselves, so you need to design something that can control these people.” Again, at some point, we have to draw the line somewhere – or at least, to me, it’s a question of what’s a reasonable expectation. I know and understand that’s not necessarily how the law always works – maybe for negligence, sure, but it might be a little bit different here.
NASIR: If you’re using FaceTime at all, whether you’re holding it in your hand or having it on a mount – I guess it is a subtle difference but not much of one, right? I mean, that’s not the best thing to do.
MATT: If you were using it, there’d be a difference between using three of your senses versus two, if you’re holding onto it.
NASIR: Yeah.
MATT: Just a conversation in the car can be dangerous enough, and that’s just speaking. Actually, I’m wrong – it would be four or three, because you’d be speaking, hearing, looking, and then holding. I mean, FaceTime – first of all, I don’t see how someone could even use that and also drive. And, two – not that this matters – but, on the other end, is this person just watching him drive? I don’t really get it. It’s getting off-track, but we always try to throw this question out here: what do we expect to happen? This is a tough one.
Obviously, Apple’s got some deep pockets.
NASIR: Yeah, and it’s a heavy lawsuit, too.
MATT: Right.
NASIR: And most of these things often do end up settling because of the nature of it.
MATT: Yeah, I mean, Apple doesn’t want this going to a jury, that’s for sure.
NASIR: Yeah, and also creating precedent and all that. So, a settlement is definitely the best option. It’s just a matter of what they can get the family down to and what’s acceptable, et cetera. But it’s important to understand that we’re talking about the FaceTime app which, all it is, is a Skype-like video conferencing app, as we talked about.
But there are so many other apps out there that have been an issue of controversy – or maybe apps that are going to become one – and the first that came to mind was Pokemon GO. Someone – I don’t want to say “classically” – but someone unfortunately was killed while using the app. I’ve never used it, but they were using the app while walking across the street and somehow got struck by a car. Of course, that could happen with any app but, specifically with Pokemon GO, you’re immersed in your phone mixed with reality, so you’re actually navigating and walking around with your phone. It’s a little bit different, but you can apply this to pretty much every app, right?
MATT: Yeah. You know, it wasn’t just a one-time incident. I’m sure there are probably sites – websites even – dedicated…
NASIR: Just dedicated.
MATT: Yeah, just all these different…
NASIR: I mean, trespassing issues, because people would try to go into people’s backyards to catch some Pokemon and things of that nature. And then, there are also apps like the fitness tracking apps. I mean, they recommend you drink a certain number of cups of water per day – wherever they came up with that standard. But the point is, I can definitely see how those kinds of suggestions – especially if they’re medically related – can lead to some kind of negative result. Of course, the argument is going to be that you need to have some kind of warning.
By the way, when we talk about warnings, the warnings can’t just be buried in a long terms of service – that is, if you are required to have a warning because of some kind of unpredictable dangerous result. Some warnings can be ridiculous, too. When I talk to clients, I go, “You know, there’s a standard for when you need to put a warning,” and you can always be very conservative about it, but the most ridiculous one I’ve ever seen was back in the day when we had CDs and DVDs. I had bought a DVD rack and it literally said – as a warning, with no exaggeration – “Do not use this rack as a ladder.” You had to see it, though, because it was literally two feet tall. It was just, you know, very comical, of course.
MATT: You wonder whether the lawyers were really racking their brains or whether it was more of a response to something that actually happened.
NASIR: And that’s what they say, but it would have made more sense if they’d said don’t use it as a stepping stool – though I guess “ladder” may cover everything. But, still, is there an unpredictable result that you’re warning against – that somehow you shouldn’t be using this device as a ladder? At that point, you might as well say, “Don’t swallow this!” or “Don’t eat it!” or “Don’t throw it at people!” It’s just as obvious, right?
MATT: Yeah, I agree with you. I mean, obviously, with all of these apps, it’s going to depend on what the app is designed to do and how someone might be using it. I’m trying to think of a basic app – a calendar, for example. The worst case, I suppose, is someone using it while driving. But I think the Pokemon GO example is a good one – any of the apps that have any sort of navigation or directions or GPS or tracking of where you’re at.
NASIR: That you’re likely to use while driving.
MATT: Yeah, that’s really when the potential exposure to liability starts expanding. You’ve really got to think about it and how people will go about using it. It might not even be the worst idea – I mean, I’m sure a lot of these companies do testing – but maybe even kind of throw it out there to everyday people, you know.
NASIR: But even a simple evaluation from an attorney – if you have them on retainer or whatever – a simple conversation thinking about what the risks of the app are. Most of the time, like you said, if it’s a simple app, it’s going to be obvious and you’re not going to need anything. But it matters especially when you get into an area where the app is unusual, because people are going to start using it in ways you didn’t expect – that’s usually what happens. For example, Waze didn’t have that warning in the beginning. I used it years ago and it was just regular – it didn’t even have any kind of warning or terms. You just started using it. But now, literally every time you open it, you actually have to go through that process. And so, that was a course of learning for them.
MATT: Yeah, and I’m sure there’s good reason behind that. We can always say to be proactive about it but sometimes it is reactionary. It might not even necessarily be your company or your business. It could be reactionary based on something else that happened. What do you have? Samsung?
NASIR: I have a Samsung Galaxy.
MATT: You’re also driving right now.
NASIR: Yeah, I’m driving. I’m actually taking out the case because I forgot. Galaxy S7.
MATT: I have never had one. Does it have some similar sort of FaceTime? You might have Skype on there, I guess.
NASIR: Actually, you know, they do. I used to use WhatsApp for video but now, if you’re calling two Samsungs, you have the video app. I don’t know what they call it – probably just video.
MATT: My point was that there are other companies – competitors of Apple – that might have seen this.
Sometimes you see other things happen that make you think twice, even when it wasn’t your problem to begin with. I’m not giving that as good advice, but sometimes you’ve got to be open and think about things like that.
NASIR: FaceTime is pretty neat if you think about it. I just want to say that, because the whole video conferencing on a mobile phone thing is just awesome.
MATT: I actually got a FaceTime call today from a friend of mine, but I think it was an accidental call.
NASIR: Oh, that’s dangerous.
MATT: I don’t have any idea why he would be FaceTiming me.
NASIR: Was it while you were driving?
MATT: I was flying a plane so it’s not the same thing.
NASIR: Oh, okay, it’s not the same. I don’t think there’s any laws against that in California.
MATT: Planes operate themselves.
NASIR: All right, very good. Thanks for joining us everyone.
Looking forward to our next episode which is going to be on the topic of… and then what we’ll do is I won’t say anything and we’ll just fill it in later. I don’t know what the topic is going to be.
MATT: That’s good.
All right, keep it sound and keep it smart.
