In today’s Omni Show episode, we conclude our two-part privacy special by welcoming the Electronic Frontier Foundation’s Executive Director, Cindy Cohn, into the conversation. Ken, Cindy, and Andrew chat about why you should care about your privacy and user agency as it relates to iOS’s recent update (click here to catch up on part one).
We then broaden the conversation to include techniques and tips you can use to better protect your privacy as you surf online.
You can find out more about Omni’s ongoing commitment to your privacy on our brand new privacy page.
Find out more about the EFF’s amazing efforts at protecting your digital rights here.
Some other people, places, and things mentioned:
Andrew J. Mason: Cindy, what does it mean to solve some of this at the systems level?
Cindy Cohn: When you buy a car, you don't then have to go and look online to find the right kind of brakes and evaluate them and figure out which are the right brakes. Your car comes with brakes. Technical tools need to be the same way, in that they shouldn't outsource all of the hard technical thinking about the right choices to you and then leave you with very few tools to do it.
Andrew J. Mason: You're listening to The Omni Show. Get to know the people and stories behind the Omni Group's award-winning productivity apps for Mac and iOS. My name's Andrew J. Mason. And today we talk with Cindy Cohn, Executive Director of the Electronic Frontier Foundation, and Ken Case, CEO of the Omni Group, for part two of our two-part privacy episode special.
Andrew J. Mason: Welcome everybody back to part two of our two-part privacy-focused episode special, partially in celebration of Apple's privacy-focused updates for iOS. Like I mentioned, my name's Andrew J. Mason. And in addition to Ken Case today, we also have Cindy Cohn.
Andrew J. Mason: I mentioned she's the Electronic Frontier Foundation's Executive Director. From 2005 to 2015 she also served as their legal director and general counsel. In 2018 Forbes named Cindy one of America's top 50 women in tech. And we have Cindy here to school us on all sorts of privacy issues. Cindy, thank you so much for joining us today.
Cindy Cohn: Thank you so much for having me.
Andrew J. Mason: My goodness, it's our honor. And like I mentioned, we also have Ken Case in the house with us today. And Ken, before we get started, I would love for us to just kind of dip back into that part one episode. And can you just share with us what is Omni's official stance on privacy? Anything that comes to mind with how Omni likes to conduct itself when it comes to all things user privacy?
Ken Case: Sure. Well, first I just want to say how thrilled I am that Cindy was able to join us on the show today. The EFF has done some wonderful work through the decades, and I'm very pleased that we've been able to have this conversation. We really appreciate all of the work that your team has done, Cindy.
Ken Case: So over the years, privacy has been a very important element of the computing experience. As somebody who grew up with computers primarily in the '80s (I used them a little bit in the '70s), and who saw the network start to blossom and then become available to everybody in the '90s, one of the most exciting things for me is having these tools that you can use to communicate with each other. But it's also interesting and dangerous how that communication can be used for surveillance and for some of the dark patterns out there.
Ken Case: And the EFF has been on the forefront of this now for the last 30 years, helping explore this frontier and thinking about what should policies look like to ensure that the future of the network is bright rather than dark.
Andrew J. Mason: Perfect, beautifully stated. And then Cindy, for those that don't know, what is the EFF?
Cindy Cohn: So the Electronic Frontier Foundation is 30 years old. We predate the World Wide Web, and we were created by some visionary folks who really wanted to make sure that when the world went online, our rights would go with us. They recognized that there was going to be a need for an organization that understood how tech worked at a very deep level, that was there in the policy and legal fights, and that helped creators recognize when rights are at stake in how a tool is designed, how it's used, and how the law applies to it, and that really stood up for users' rights.
Cindy Cohn: We've been called pro-user zealots by a Canadian regulator. And I think she thought that she was saying something bad about us, but we put it on a bumper sticker, we were so proud of it. So our job is really thinking about the user and the user's experience, but also the creators because we're all geeks. We love cool tools and we want tools that are designed to serve us rather than to serve us to advertisers or other people. We should be able to have all the cool tools, but also have those really work for us.
Cindy Cohn: So we've been involved in cases around the First Amendment and free speech, around privacy, around protecting innovation and your right to stand on the shoulders of giants and build your tools based on the tools that came before. Lots and lots of issues around these things, but all kind of centering around making sure that we're building a digital world that we want to live in.
Andrew J. Mason: Cindy, I'd love for you to take a second and frame the importance of privacy within the minds of an average tech user. You know, so many people think things like, "Well, I use passwords, so I'm good," but where do you even start to say that's not necessarily the whole ball game here?
Cindy Cohn: Well, I think most people kind of intuitively understand privacy; sometimes it's getting it to apply to the digital world, where things are a little less clear, that's hard. At the most basic level, most of us shut the door when we're in the bathroom, and most of us have doors on our houses that shut. We all need space where we're not being watched to be able to do things.
Cindy Cohn: And I think that's tremendously important in the digital world as well. In the '90s especially, but even today, you find people in marginalized groups, or who live far away, using the internet as a place to explore new ideas and new ways of thinking and new things. This is hugely important for the LGBTQ community: to be able to have a space where you're not being watched, where you can try out ideas and do things.
Cindy Cohn: And this is important in personal ways, as I've talked about, but also as a political matter. If you want to change your government, you need to have a place to have a conversation about what you want to see changed. And that place has to be private; otherwise change doesn't happen. You see this in authoritarian societies, where people's inability to have a private conversation really gives power to authoritarian governments.
Cindy Cohn: So, whether it's the hyper-local level of your individual life or the big-picture questions about how we're going to govern society, privacy matters at every level.
Andrew J. Mason: Perfect. And that brings us to why we're here today to talk about the iOS 14.5 update. Apple's done some pretty significant things here from a privacy standpoint. Can you maybe walk us through some of that?
Cindy Cohn: So Apple is creating a requirement that apps tell you what they're doing with your privacy, and I'm going to get the specifics wrong, but essentially let you opt out of third-party tracking. In privacy, we think about first-party tracking, which is the company or the entity you're directly interacting with. But a lot of the advertising business model works on third-party tracking, where the company that you're interacting with is handing your information to somebody else who's doing a lot of the tracking.
Cindy Cohn: Apple is shifting so that when an app is going to do third-party tracking, it makes you opt in instead of opt out, and it makes you specifically give permission for that tracking. So it's putting you in control of the third-party tracking that happens over apps, which has not been the case.
Cindy Cohn: My view is it always should have been the case. You should have been in control, but much of the Internet's infrastructure around apps especially, is really not giving you control or giving you kind of fake control. And I think what Apple is doing here now is really putting users in the driver's seat about what kind of third-party tracking they want to approve.
Cindy Cohn: So to me, it's a straight user empowerment move, and we think Apple's right to do this. I mean, we wrote a blog post about this in late December, and it ended with, "Your move, Google."
Andrew J. Mason: Ken, as a developer working on these projects, what sort of things is Apple asking you to disclose or share?
Ken Case: They, of course, are very strict about disclosing third-party tracking, but they're also asking app developers to disclose their own first-party tracking, and I think that's wonderful. For example, some of the information we disclose about our apps is that when you sign into our apps, we know who signed in, and we know what you purchased. That's how we can determine which features we should be presenting to you in the app, because you purchased them.
Ken Case: And that's the level of tracking we do. And we now disclose that through our App Store labels.
Andrew J. Mason: Cindy, I remember a quote from that blog post you're mentioning, something to the effect of: requiring trackers to get your consent before stalking you across the internet should be an obvious baseline, and we applaud Apple for this move. To you Cindy, and then Ken, I'd also love your input: how big of a step in the right direction is this?
Cindy Cohn: Well I think it can be a very, very big step and I really want to applaud Apple for doing it because the app store is so important. We have concerns about the fact that we live in a world where there are two big app stores and you either sign up for one or you sign up for the other. I think in the long-term, that's probably not wise, but if you're going to have that kind of power, using it to empower your users is the right step.
Cindy Cohn: Again, as I said, I hope that this is the start of something and not the end of something. It's still big, even if it's just Apple doing it, because they're such an important player. For us, every step leads the way to the next step towards empowering you. And there are certainly next steps that can happen, but I don't want to underestimate this. This is a really big step.
Ken Case: To be clear, this is not Apple preventing any of these things from happening. It's Apple ensuring that it's disclosed to the people being tracked, or the people who are using the software that it is happening and letting them make the informed choice rather than just ignorantly assuming that, "Oh, if I'm playing this game, you're not tracking other information about me at the same time."
Cindy Cohn: Yeah. It's one of the things that's been going on in the surveillance business model space a lot where, when I sit on panels with people from the ad industry or from Google, they'll say, "Oh, people really understand the trade-offs, and they have made a choice that they don't mind being tracked in exchange for the business model working." And I'm like, "Well, great. If that's really the case, why don't you give them the choice? Because if you're right and everybody loves it, then everybody's just going to choose it." Like a fair choice, not a, "You don't get this tool unless you choose it." That's not actually a fair choice.
Cindy Cohn: So Apple is switching this to a situation where you're empowered to make the choice. And I think it's really great that for a company like Omni, this just isn't that big a shift. I mean, I'm sure that there's some work involved in it, but the business model wasn't about surveillance, and so it's not that big a deal to lose it.
Cindy Cohn: But I think that for the other apps you're using, I really want people to take the chance to exercise their voice, so that we can put the lie to what I think is this phony story that everybody loves being tracked. Having that shoe ad follow you all around the internet is just what we all got online for.
Andrew J. Mason: You know, that reminds me of a quote I read in a recent blog post as well. Apple's director of global privacy said something to the effect of, "Advertising that respects users' privacy is not only possible, it was the standard before the internet was created. And now this practice of unfettered data collection has become normal in people's minds."
Andrew J. Mason: I'd love for both of you to answer: for those of us that are over 30, do you think maybe we've forgotten what privacy feels like?
Cindy Cohn: I think so. I think that the industry has done a very good job making people think that that's the only way to make money on the internet. And they've also done a pretty good job of convincing people that behavioral advertising is significantly better than things like contextual advertising.
Cindy Cohn: What I mean by contextual advertising is: you go to a search engine, you type in that you're looking for shoes, and it gives you an ad for shoes. That's not the kind of advertising we're worried about. Taking the context you're in and placing an ad that's consistent with it is fine; it's an earlier business model. So it's not about ads per se. It's about all of the surveillance and tracking that goes into this dream of marginally better ad returns if they know everything about you and can make inferences about you.
Cindy Cohn: And there's some interesting research now, some from a guy named Tim Hwang, showing that the marginal difference here isn't that great. And most of the marginal difference goes to Google; it doesn't go to the publisher. So to a certain extent, I feel like we're trading in our privacy and our sense of not being watched, which I think is a core value, for pennies. It's not a good trade-off.
Ken Case: Yeah. So for years we have struggled with where the right place is to spend advertising dollars. But anything that is based on tracking users, rather than advertising in the context of their interests, has always just felt kind of dirty to us. Right? I feel totally comfortable advertising Mac software on a Mac forum or a Mac-related website. You're there because you're interested in the Mac, and our ad helps support the community that you're visiting.
Ken Case: But if you then go to some random newspaper, completely unrelated, where you're reading about politics, and you see an advertisement for our Mac software, it feels out of context and weird to me. I'm not sure how effective it is in the first place, but it certainly doesn't feel like the way advertising ought to go. Yet it is the way it has gone for a while now.
Andrew J. Mason: I'd love to talk about this concept of utilizing user data to feed back to users what advertisers think users might need. I know that efficiency is a good thing, and optimizing algorithms can be useful, but does this take away serendipity? Is over-optimization a good thing? When you have that shoe ad following you around everywhere, something does feel artificial and a tiny bit creepy about it. What's lost, and what's gained?
Cindy Cohn: I think the first question is whether it's helpful or not, and I think the jury's out. There's a lot of evidence that these behavioral advertising models are discriminatory, for instance. So if you're not in the majority white culture, it's not really clear that this stuff works very well. Right?
Cindy Cohn: I don't want to just pick on Google, but I'm happy to pick on them a little. With ads on Facebook, we know that if you're looking for a job as a CEO and they know you're a woman, you're not going to see as many CEO ads, because their algorithm has decided that you're not appropriate for a CEO role based on your gender.
Cindy Cohn: They didn't mean to do that. Like, I don't think anybody programmed it that way, but if you're going to build profiles of people based on huge amounts of data, and you've got a society that is discriminatory in the way it acts, you're going to get discriminatory inferences.
Cindy Cohn: So the prices that people get charged, the ads that they see, all of this stuff: there's a tremendous problem around discrimination and how marginalized groups are treated. So on the first part of this, the idea that this is slightly better, I guess the question is: for whom is it slightly better? For a whole bunch of our society, it's actually not better in terms of more relevant ads, except by building on top of a society that has problems. And those problems not only get replicated, they get amplified through the machine-learning techniques that are the basis of a lot of this advertising.
Cindy Cohn: So the first thing is I would push back a little bit on the "better" question, because I think it's really fraught and you have to open it up. But more importantly, the thing that Apple is doing now isn't saying you can't get those ads; it's saying that you choose. And I think empowering users is the ultimate answer to the question of whether this is a good idea or not.
Cindy Cohn: And for too long, as I said, I think there's been a bit of a lie that the industry has told itself that everybody really likes what they're doing. And I think that the answer to that is, well, let's give people real choices, let them opt in, not opt out, make them give permission, make sure that you're giving them real choices.
Cindy Cohn: Everybody clicks "I agree" on the terms of service, but they don't click "I agree" because they actually agree, right? The take-it-or-leave-it nature of a lot of these contracts just leaves people feeling powerless. So I think Apple is taking a step towards empowering users. And if it turns out everybody loves behavioral advertising, maybe we'll see.
Cindy Cohn: But I think that that's not true. I think most people don't know they have the choice and Apple's going to start to give them a choice. And I think once you give people a choice, you're going to have a much bigger mix. Some people may want this tracking, but I think a lot of people are not going to want this tracking.
Andrew J. Mason: Talk to me about the tech user that is just resigned: "You know what, the toothpaste is already out of the tube, my data's out there anyway, we should all just be okay with big tech using our data." Is that an okay thing? Because up until now, you could turn off privacy options, but they might be buried 14 screens deep, and they might change or turn back on with every single update. So something that was no privacy by default, unless you explicitly turned it on, is now flipped with Apple's operating system: it's privacy by default, unless you say you don't mind being tracked.
Andrew J. Mason: How do you reframe that argument for somebody that says, "You know what? We should just all be okay with tech using our data"?
Cindy Cohn: I think a couple of things. One is that the fact that it's hard doesn't mean it shouldn't be done. As a society, we've taken on harder problems than this. We've stopped wars. Right? I just think that the idea that we're powerless about our privacy as a society is really a kind of nihilism and defeatism that isn't actually how people feel.
Cindy Cohn: Every time Pew Research does a survey of people's views on privacy, people really want to protect their privacy. They just don't think that they can, or they don't know how. So that's the first thing: it's a resigned view, not an actual empowered view.
Cindy Cohn: But the second thing is that privacy is tricky, in that sometimes people think about it only in their own individual case. Certainly individual privacy matters; as I said, people shut the door in the bathroom. But I think privacy is a societal value as well. That's why it's in the international human rights treaties. That's why there's a version of privacy in our Constitution, and certainly in our state constitutions, California's and others: because it's a community value. And by that I mean that I think it's a mistake to only think about your own personal experience around privacy instead of looking at societal needs.
Cindy Cohn: Just like we don't think freedom of speech only matters if you want to be the person standing on Speakers' Corner saying things. We recognize that giving other people the right to speak, and having a society where people can speak freely, makes for a better society, including our ability to change our government if we don't like how it's working.
Cindy Cohn: Privacy works the same way. Even if you don't need privacy, society as a whole is better off if we all have privacy. And again, I think that especially matters if you think about marginalized groups, or groups who are trying to change the government. We work with people all around the world who are under authoritarian regimes, and having the ability to organize privately, without the government seeing it, is the key difference between creating a democracy and living under authoritarian rule.
Cindy Cohn: There was just a story recently in the New York Times about [inaudible 00:19:36] in that regard, where the opposition is getting squashed because the government can read everybody's communication. That's a very extreme example, but across the board, privacy is something we care about both because it helps us individually and because it helps us as a society.
Andrew J. Mason: So beautifully stated. I'd love it if you would flip that and look a couple of years into the future, and say that EFF's efforts have been wildly successful. What does the tech landscape look like? What's possible? What are we doing? How is this all framed when people's human rights and privacy rights translate well into the online space?
Cindy Cohn: Well, I think the first thing is that we would stop having what I call two-faced apps, right? Where the app pretends to be something you can play a game on, but it's really tracking you and feeding your data in. The business models would be clear, and we would have decided on the ones that we like. So getting rid of what I think of as the two-faced nature of the apps and tools that we use is something that would happen.
Cindy Cohn: So when you started to use a tool, you'd know what the deal is. And if you didn't want that to be the deal, you could opt out of it. I think we would have baseline privacy that was protected. When you ask people if they think their privacy is protected, you find out that most people think that the law protects them far more than it does. Like, "Oh, well, they couldn't do that. That's illegal." And I'm sitting here with my lawyer hat on going, "Well, actually, not only can they do that, but they do that all the time."
Cindy Cohn: So I think that we do need a baseline privacy law that sets a standard for all the people who are building tools, so that you, as somebody who's got ethical ideas about how to build tools, aren't competing with somebody who has unethical ideas and is just lying about it.
Cindy Cohn: So in that future we have baseline privacy protection and the ability to have accountability for it. Ultimately, you as a user have tools that you understand: how they work, what the business model is, and that they basically are serving you. It's not that hard. Again, I think Omni is a pretty good example of a tool that, at least as far as I know, was always pretty straightforward about how it worked.
Cindy Cohn: So there are other ones out there if you look for them. I think it's not that hard to get back to that space, and you're right that we had this space in the '90s. Jane Horvath's quote from Apple is exactly right: we can get back to a space where we just have a straightforward digital experience rather than a kind of lying one.
Ken Case: Yeah, for me, the idea is returning to that world of the promise of the internet that I felt like we were seeing at the end of the '80s and the early '90s. When we built our OmniWeb web browser in '94, the web was not yet monetized. It was very early days, and most people were simply sharing information freely through this medium and creating value for themselves and each other. They weren't trying to track each other to create that value; they were creating value and sharing it with each other. And then we were able to build on each other's work and make things better and better.
Ken Case: And within a few short years, we started to see these darker patterns take hold, where folks were starting to track people's browsing history without their knowledge and consent as they were going from one site to another.
Ken Case: And that's why as early as '96, we started building features into our web browser saying, "Okay, well, here are some cookie controls to let you block third-party cookies, and to block that tracking. Or here are some preferences for whether you want to ever load an image from a third-party site, so you can avoid those tracking pixels and things like that."
Ken Case: And of course we were saddened that that was not the way other web browsers went for decades. I'm glad that at this point we're starting to see a trend of Safari and other browsers building in more of these privacy features.
Cindy Cohn: Yeah. I think that's a really good point, and I want to lift it up a little bit more. One of the pieces of my future world is a world of much more interoperability. We've come to a world, and I referenced this a little earlier, where we've got three or four big companies with platforms, and everybody has to negotiate the rules on those platforms, as opposed to having a range of choices.
Cindy Cohn: And here again, Apple's using its platform power to do something good, which I support, but I still think the world would be better off if we didn't have these giants that you have to negotiate with to run something. One of the ways I think we get there, and Ken hit on it, is interoperability, right? Where people build a tool, and somebody else gets to build off of that tool and do something cooler, and then somebody else says, "Oh, wait, I need this feature. I need to add it on too."
Cindy Cohn: And we have really lost that ethos in the big tech tools that a lot of us rely on. I mean, the open source community still works that way; it's vibrant, it gives us lots of tools, and I don't mean to belittle it. But that's not how Facebook works, and that's not how most of Google's tools work, at least for users, unless you're deep in their thrall about what rules you get to follow.
Cindy Cohn: And so interoperability is a really key part of getting us to that future world. And it's kind of back to the future, right? There are some values from the early internet that I think are really important, and if we start with those, we can build an even better future, including the ability for people to make money.
Cindy Cohn: I also just wanted to hit on cookies for a second, because I think Ken's right that the introduction of cookies, and the way cookies have been used as part of the business model, was really one of the worst pieces of all of this. I mean, cookies do stuff that's fine too, like making sure you get the right translation or the right version of a website, and things like that.
Cindy Cohn: But cookies are slowly going away, and Google has just announced a bunch of changes to try to make cookies go away. But the devil's in the details, because we think that the particular way Google is going about it is going to cement their role and reintroduce some of the worst kinds of tracking modalities to the new world.
Cindy Cohn: So just because somebody agrees with you that something is bad doesn't mean the thing they're introducing next is good. You have to look again. EFF just came out very strongly against something called FLoC, which is one of the ways Google is trying to get rid of cookies. We think cookies should go away, but not by introducing something that's going to cement Google's position.
Cindy Cohn: So you have to watch, and this is what EFF does. We're 30 years in, we're watchdogs, we pay attention. If you donate to EFF, what you're supporting is people who are really watching, not just to stop the bad, but to make sure the thing that replaces it is actually better.
Andrew J. Mason: Cindy, I'd love your take on this final question, which is geared towards how we empower users. This whole concept of, and I'm putting air quotes here, "by default" means that a decision is getting made, sometimes without users knowing that that decision's being made for them. And that can result in data being handled or synthesized in a certain way, or served to a group of people, or not served to a group of people, based on settings in an app that a user might not ultimately even know about.
Andrew J. Mason: How do you frame the conversation where there's this line between "I trust a company because I can't check everything out" and "I need to educate myself and be armed with other sources of information in order to mitigate any data mismanagement"? Or is that even the right way to frame the conversation?
Cindy Cohn: I think we need both. We need to make smart decisions for ourselves, and Apple's giving us a new decision, which I think is really important, and we need to exercise our power correctly. But I think we also need to think a little more societally. Some of what we need here are laws, clear privacy-protecting laws. When you buy a car, you don't then have to go and look online to find the right kind of brakes and evaluate them and figure out which are the right brakes. Your car comes with brakes.
Cindy Cohn: Technical tools need to be the same way, in that they shouldn't outsource all of the hard technical thinking about the right choices to you and then leave you with very few tools to do it.
Cindy Cohn: We need a baseline privacy-protective law that people will then build their tools around. You still need to be empowered, and you still make some choices, but we can't put you in a world where you're supposed to figure out how the third-party ad-tracking networks work and whether that pixel is a good idea or not. That's just not fair.
Ken Case: There are so many subtle decisions that happen in the technology world; it's sometimes hard even for people who are deep in that technology to make decisions about them. I don't know whether we always want to be presenting everything to customers, but we certainly do want to give them a level of control: "Here are my priorities, and here are the sorts of things that I care about," or at least present the information that lets them make those choices.
Ken Case: Now, I don't know that where we are right now with this level of disclosure is really giving users quite enough agency yet. It gives them a little more agency. You can see that some of these things are being tracked, and then if you want to use the service, well, you kind of have to say, "Well, all my friends are on Facebook and I'm going to be on Facebook, and so I guess I'm consenting to this," whether you really want to or not.
Ken Case: But at least it's a step in the right direction, and I think it opens the door for Facebook to maybe do some self-reflection about how much of this they really need, so that they're not putting off some percentage of their user base.
Cindy Cohn: Yeah, I think that's right. We really do need to get to the place where choice means real choice, not shrugging your shoulders and saying, "I guess I got to put up with this because I want to talk to grandma." I think interoperability is the key, right? I mean, this is how the internet used to work, right? If you were sending an email, you didn't both have to be on the same service.
Cindy Cohn: In fact, we started ... there were services, in the early days of AOL or CompuServe ... this is old school stuff, I'm a gray-haired lady ... where you did have to be on the same network with people in order to be able to email with them. And then that business model failed. People wanted to be able to talk to everybody regardless of what service they were using.
Cindy Cohn: You know, similarly with the phones, you don't have to be on the same phone network with somebody to be able to call them. And yet everybody has to be a Facebook member in order to be able to talk to people on Facebook. So we know how to build distributed systems that interoperate. In fact, these closed platforms, we call them walled gardens, are relatively new.
Cindy Cohn: So we need to start putting some pressure, and some of it may be legislative pressure on companies to open back up again so that it's not an all or nothing thing with Facebook, but maybe you could be on a Mastodon instance, to point out one of the distributed networks, and still be able to talk to grandma on Facebook.
Cindy Cohn: EFF just published a white paper called Privacy Without Monopoly, where we kind of went through some of these issues around interoperability, specifically around privacy, because there are some tricky privacy issues there, to try to point out a way forward that would work. And it's not the only way, but we felt like it was important to get out there with some ideas about what the things are that are getting in the way right now, some of them legal, some of them cultural, and some of them business, of creating a much more interoperable world where, again, you having a choice means a lot more than it does right now.
Andrew J. Mason: Cindy and Ken, I have loved this conversation, and thank you, Cindy, for that metaphor, that is so beautiful, this idea of you don't have to pick out your car's brakes. So some things can just be solved at the systems level. I think that's a fantastic way to look at it. Cindy, if people are more interested in connecting with you and your work with the EFF, where can they do that?
Cindy Cohn: Well, eff.org is our website. Our blog is called Deeplinks Blog. That's where you'll find most of what we're active on. But it's a big website. As I've said, we've been around for 30 years. If there's a piece of the internet and your rights that you care about, chances are we have a piece of our website talking about that.
Cindy Cohn: And of course we work for tips, right? We're a nonprofit. So we have around 40,000 dues-paying members who provide the bulk of the money that we use to hire the 100 or so people who work on these issues. And so if you think that this is important, and that it's important to have a watchdog, I sure hope you will support us.
Ken Case: Well, thank you, Cindy. I really appreciate all of the work that your team has done through the years. I've been proud to support it, and I know others at Omni support the EFF as well. And to those of you listening, I hope that you've been inspired by this conversation and will join us.
Cindy Cohn: Oh, thank you so much. And we have cool swag too. So lots of good t-shirts and hats. [inaudible 00:32:30]
Andrew J. Mason: Grab a hat. Thank you so much for your time with us today, Ken and Cindy.
Cindy Cohn: Thank you.
Andrew J. Mason: And thank all of you for listening today. Hey, we're curious, are you enjoying the shows? Are you enjoying learning how people are getting things done using Omni software and products? Drop us a line at The Omni Show on Twitter. We'd love to hear from you there. You can also find out everything that's happening with the Omni Group at omnigroup.com/blog.