Hey CodeNewbies, I wanted to share another show that I host called DevNews where we cover the latest in the world of tech.

We’ve covered some really awesome topics on the show and have had some amazing guests, like Dan Abramov of React, and core team developers from Rust and TypeScript.

I wanted to share this episode with you because we talk about some really interesting news, as well as have a great conversation with a journalist at Motherboard about the U.S. military buying location data from seemingly innocuous apps, including a Craigslist app, as well as a Muslim dating app and a Muslim prayer app.

So take a listen, rate, and subscribe!

[00:00:10] SY: Welcome to DevNews, the news show for developers by developers, where we cover the latest in the world of tech. I’m Saron Yitbarek, Founder of Disco.

[00:00:19] JP: And I’m Josh Puetz, Principal Engineer at Forem.

[00:00:22] SY: This week, we’re talking about how Apple server problems caused slowdowns and crashes for apps launching on all versions of macOS, the rise of school districts being the targets of ransomware attacks, and GitHub reinstating YouTube DL, a program to download videos from YouTube and other video sites, after a Digital Millennium Copyright Act takedown.

[00:00:43] JP: And then we’ll chat with Senior Staff Writer at Motherboard, Joseph Cox, whose piece entitled, “How the US Military Buys Location Data from Ordinary Apps,” shines a spotlight on the location data industry and who is being targeted.

[00:00:54] JC: Even people in that supply chain didn’t necessarily know where the data was ending up, especially some of the app developers themselves.

[00:01:02] JP: If you’re like us and use a macOS computer for your development work, last week you may have noticed something strange going on. On Thursday, November 12th, Apple released macOS 11 Big Sur. And around that same time, users started reporting problems launching applications on existing versions of macOS. Applications would take much longer than usual to launch or fail to launch at all. Personally, I restarted my MacBook several times and I was starting to think that my hard drive was going bad. What was actually happening was a process named “trustd” was attempting to contact a server at Apple. Now this process is responsible for confirming that an application’s security certificate is valid before it launches. This process is supposed to fail softly if you have your internet turned off or if it can’t reach the server. However, requests to the server weren’t just failing. They were taking a really long time to time out, and that caused trustd to spin on users’ computers. Many users were surprised to find out Apple has this level of control over applications on their computer, and privacy experts immediately started sounding alarms. Apple responded with an update to a privacy explainer on their support site that promises updates in the next year to make this process more resilient to server failure, as well as letting users opt out of the entire system. This episode has many users questioning what it means to own their computer when a company has this level of control over it.
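
To make the “soft fail” idea a bit more concrete, here is a minimal sketch of that kind of check. This is not Apple’s actual trustd code; the endpoint, response format, and two-second timeout are purely illustrative assumptions. The point is that a soft-fail design only behaves well if the unreachable case fails fast; a server that accepts connections but responds extremely slowly never trips the failure path quickly, so launches end up waiting on it.

```python
# Minimal, hypothetical sketch of a soft-fail certificate check.
# Not Apple's implementation; the endpoint below is a placeholder.
import requests

CHECK_URL = "https://ocsp.example.com/check"  # hypothetical endpoint

def certificate_ok(cert_fingerprint: str) -> bool:
    try:
        # A short timeout is what makes the failure "soft": if the server
        # can't answer quickly, give up and let the app launch anyway.
        resp = requests.post(CHECK_URL, data=cert_fingerprint, timeout=2)
        return resp.ok and resp.text.strip() == "good"
    except requests.RequestException:
        # Offline or unreachable: fail open rather than block the launch.
        return True

# The November incident behaved as if this timeout were effectively very
# long: responses trickled in slowly instead of failing fast, so app
# launches hung behind the check.
```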

[00:02:18] SY: Yeah. That’s what I thought was really interesting because I guess there’s a couple of things that’s interesting. The first is the fact that Apple has been doing this and it seems like no one really knew that this was happening, which I feel like Apple is like the one big tech company that’s all about privacy and being transparent and all that. It doesn’t feel nefarious or anything like that, but it feels off brand that this is happening.

[00:02:45] JP: Apple has pointed out that this whole process is something they call “Gatekeeper”, which is supposed to protect your computer from malicious apps and basically give Apple a way to shut an app down should they find out it has a virus included with it or it’s doing something malicious. They’ve actually done that a handful of times over the past couple of years, but I think most users haven’t really focused on what could go wrong should these services go down.

[00:03:11] SY: Yeah. That makes sense. But I like that you can opt out of it now. That is very on brand for Apple. I appreciate that.

[00:03:17] JP: They said they would give you the ability to opt out of it in a future update. Right now developers can edit some command line system preferences to turn some of these things off, but there’s not a great way to completely opt out of the system yet.

[00:03:32] SY: Right. Right.

[00:03:32] JP: I think this brings up a question of why does Apple feel like they need this level of control? I mean, it’s very nice of them to want to protect our computers from viruses and malicious programs. But at the end of the day, as a user, I made the decision to download something and I’m making the decision to run it. So should Apple have this level of power?

[00:03:56] SY: Yeah. I mean, it’s interesting because as a developer, my instinct is to say no, and to say that I know what I’m doing, I know what I’m downloading, but I don’t think that’s true for most people.

[00:04:08] JP: That’s true.

[00:04:08] SY: You know what I mean? If we’re being very, very honest, I think that for most people, I don’t think their guard is necessarily up the way that it generally is for most developers. I don’t think they’re kind of thinking through, “Oh, this website isn’t necessarily reputable.” They’re not even maybe aware of how bad a virus can get. You know what I mean? I don’t know if most people really go through the process they should be going through to protect themselves. And if that’s the case, Apple being the walled garden and Apple being very, very intentional about creating the best experience and making sure that you’re always safe and you’re always guarded, I’m not necessarily surprised that they’re doing this and it definitely feels like it comes from a good place at least.

[00:04:51] JP: I think this procedure definitely happens on users’ phones, whether they’re on iOS or Android. And I think as users, we’ve just internalized that process.

[00:04:59] SY: That’s true.

[00:04:59] JP: And this happening on a desktop or laptop computer, I think, is what was very surprising to people.

[00:05:04] SY: Yeah. Because there’s something about the app experience where I never feel like I own my app. I don’t know what exactly the difference is or I can’t really explain it, but when I download an app, I feel like, yeah, it’s something I have access to. It’s not something that I have. Whereas when I think about things I download on my computer, I definitely feel a sense of ownership on my laptop that I don’t feel on my phone.

[00:05:30] JP: I do as well. Yes.

[00:05:31] SY: Yeah. That’s kind of how I feel about it. So it definitely wouldn’t surprise me. It wouldn’t catch me off guard. I’d feel no type of way if they did this on the phone, and they do do it on the phone. But when it’s on the computer, it definitely felt funny.

[00:05:42] JP: Agreed.

[00:05:43] SY: So there was this really great piece in the Wall Street Journal about how school districts are being targeted by ransomware hackers at much greater numbers than ever before. One of the reasons for this is most likely the ever increasing digital nature of school ever since the pandemic and quarantine. As a matter of fact, ransomware attacks increased in general, according to the US Treasury Department. So the way ransomware works is that once a hacker gets into your system, they will essentially lock it down and they’ll encrypt your data, so you can’t get into it. And many times they will demand that you pay them some amount of money, usually in Bitcoin, or else they’ll destroy all the data and will publish the sensitive information on your device, on your computer. So what makes this wave of ransomware attacks even worse is that not only do schools hold a ton of important information in their systems like addresses, Social Security numbers, and other student info, schools are woefully unprepared to battle hacks like this because on the whole, they are underfunded and technologically archaic. Sorry, schools. These are the facts. The districts are also typically left on their own to deal with these attacks with no governmental aid. So it’s a pretty tough spot for schools to be in.

[00:06:56] JP: This was really both surprising to me and completely makes sense when I think about it.

[00:07:00] SY: Totally makes sense. Yeah.

[00:07:02] JP: I have a daughter who is in seventh grade, and her school district is doing a lot of remote instruction, like many others. And from what I’ve heard from parents in a lot of other school districts, these school districts are not cybersecurity experts. They’re not even remote work experts. These teachers and school districts are trying to teach kids online with very little preparation and very little guidance, and they’re really just holding it together from a day-to-day instruction standpoint. You get the sense that they’re not thinking about cybersecurity at all.

[00:07:33] SY: Absolutely. And I feel bad for the schools because whether or not there was a pandemic, I’m sure these ransomware attacks are tough to deal with, period, but now they have to deal with that and the pandemic. There’s so much going on with schools. Like you said, so little guidance and just so much pressure to get things right with very little help. And it’s not just the fact that they’re underfunded or their tech is old. It’s the fact that we’ve never been in this situation before as the world. We’ve never been in a situation where there’s a pandemic and all of a sudden we have to do online learning. This is just such a new world that no one has faced. And then taking advantage of that by hackers makes a ton of sense, but it’s also just so unfortunate to have that added pressure and burden on these schools.

[00:08:19] JP: I find it really interesting how they talked to some of the school districts that were hit by these ransomware attacks, and the school districts were basically doing the calculus of whether they should pay or not, how bad the exposure would be, and then wondering about like, “What do we have to disclose?” Public school districts have to publish their financials and it’s getting pretty obvious if you see a large sum coming out of the technology budget.

[00:08:41] SY: In Bitcoin.

[00:08:42] JP: In Bitcoin, exactly. “What was this $50,000 in Bitcoin last month for?” But on the other hand, you can understand that school districts probably don’t want to advertise, “Yes, we paid off this ransom attack.” I wonder if it would almost serve as a beacon to other hackers to target those districts.

[00:09:00] SY: That’s a really good point. Yeah. That’s a really good point. And I’m also wondering how this would affect their kind of standing in the school district. Are they going to be punished for “making themselves a target”? Is there kind of any repercussion to being targeted? You know what I mean?

[00:09:18] JP: Right.

[00:09:19] SY: Well, how are they viewed in the eyes of the district and in the eyes of the people who will be giving them funding next year? If a school has had to spend, say, a hundred thousand dollars on this, is that hundred thousand going to be, I don’t know, deducted from next year’s budget? You know what I mean? How will their funding be affected by having to provide funding for things like this? I’m wondering how that’ll affect them as well.

[00:09:40] JP: The scariest part of this to me is that the data we’re talking about being potentially leaked and ransomed is addresses, phone numbers, and Social Security numbers of children.

[00:09:50] SY: Right.

[00:09:51] JP: If it was you or I, if our personal information was leaked, I might see some weird charges. I might see some activity on my credit report, but for kids, they might not notice their information has been leaked until they try to apply for their first credit card potentially.

[00:10:04] SY: That’s a very good point. Very good point. Very, very good point. Yeah. And that was probably the thing that stood out to me the most about this article. When I first heard ransomware at schools, I was thinking, “Okay, you’re going to publish people’s report cards?” You know what I mean? Like, “Who cares? What’s the big deal?” And it just didn’t really occur to me that we’re really talking about sensitive information of children, and that schools hold a lot of that detail. I just don’t think about a school as being like a data center. You know?

[00:10:28] JP: Right.

[00:10:29] SY: I just didn’t think of them that way, but yeah, they’ve got a lot of information. Absolutely.

[MUSIC BREAK]

[AD]

[00:10:51] JL: Triplebyte is a job search platform that allows you to take a coding quiz for a variety of tracks to identify your strengths, improve your skills, and help you find your dream job. The service is free for engineers and Triplebyte will even cover your flights and hotels for final interviews.

[00:11:05] SY: Vonage is a cloud communications platform that allows developers to integrate voice, video, and messaging into their applications using their communication APIs. Whether you’re wanting to build video calls into your app, create a Facebook bot or build applications on top of programmable phone numbers, you’ll have all the tools you need. Formerly known as Nexmo, Vonage has you covered for all API communications projects. Sign up for an account at nexmo.dev/devnews2 and use promo code DEVNEWS2 for 10 euros of free credit. That’s D-E-V-N-E-W-S, in all caps, and the number 2, for 10 euros of free credit.

[AD END]

[00:11:52] JP: Earlier this month, a Digital Millennium Copyright Act takedown notice forced GitHub to take down a project called “YouTube DL”, which is a program that allows users to rip videos off of YouTube and other internet sites. In a GitHub blog post this week, the company said, “As a platform, we must comply with laws, even ones we don’t think are fair for developers. As we’ve seen, this can lead to some situations where GitHub is required to remove code.” But also in that post, they said that after receiving a letter from the Electronic Frontier Foundation, which represents maintainers of the YouTube DL project, they had enough “new information” that YouTube DL doesn’t violate the Copyright Act or the DMCA to feel comfortable enough to reinstate the YouTube DL project. Not only did they decide to reinstate it, but they’re also donating $1 million to a developer defense fund to help protect developers who might get future DMCA takedown claims.

[00:12:44] SY: That’s great.

[00:12:45] JP: They also say they’re overhauling their 1201 claim review process and instituting several steps before any takedown claim is processed. Some of these steps include each claim being reviewed by technical experts to ensure that the project actually circumvents a technical protection measure as described in the claim, as well as being reviewed by legal experts to ensure unwarranted claims, or claims that extend beyond the boundaries of the DMCA, are rejected.

[00:13:09] SY: Wow! That’s great of GitHub, right? Don’t you think?

[00:13:11] JP: It’s really great of GitHub. Yeah, I thought it was really interesting. This is a legal problem with the DMCA. They have to respond to the takedown notice. There’s no option for GitHub to argue it.

[00:13:23] SY: Interesting. Interesting. So who’s the one that needs to do the responding?

[00:13:27] JP: From my understanding, the person or the project that the takedown claim was served against. So in this case, YouTube DL has to respond to say, “No, we’re good. We are arguing this case.” But as the host of the code, GitHub basically has to take it down. Otherwise, the complainant, in this case, the RIAA, the Recording Industry Association of America, can go above GitHub. So in this case, they could go to, like, GitHub’s network provider to take down the information.

[00:13:59] SY: Oh, interesting, interesting, interesting.

[00:14:01] JP: This whole thing brings up for me, like, is the DMCA really even still a thing? The last time I heard about DMCA takedown notices, it was when the same group, the RIAA, was trying to remove copyrighted music. The MPAA, the Motion Picture Association of America, has also used this law to take down pirated movies in the past. It seemed really weird for them to go after YouTube DL, and they were arguing that YouTube DL somehow allowed people to circumvent encryptions or protections on YouTube videos.

[00:14:36] SY: Is YouTube DL so popular that it would get the attention of the RIAA? I mean, I guess so, but it feels like a developer tool. You know what I mean? It feels like something that like we know, but I guess they know about it too.

[00:14:50] JP: From what I’ve heard and read, it’s really popular among journalists, researchers, all sorts of folks that want to archive video, view it offline, or save it for future usage or reference.

[00:15:02] SY: Right. Right. Well, good for GitHub. I think it’s really great to have such a huge organization, such a powerful organization, basically stand up for us and give us an opportunity to defend ourselves. Because especially as open source contributors, we don’t have a lot of money. You know?

[00:15:19] JP: Right. I don’t have a legal defense fund in my back pocket.

[00:15:22] SY: Right. Exactly. So this is great that we’re finally able to like protect ourselves and stand up for ourselves. And the fact that the EFF is getting involved too is, again, really great, a huge organization, a really well-respected organization, and having those to kind of support open source contributors is really great for that community.

[00:15:42] JP: Yeah. Outside of actually like overhauling this law or changing this law, I think this is the best we can hope for right now.

[00:15:49] SY: Yeah. I’m interested to see how it’s going to be used in the future, and hopefully we’ll have some good defenses and people will be able to have their apps distributed and not have to worry about these DMCA takedowns. So coming up next, we speak with Senior Staff Writer at Motherboard, Joseph Cox, about a piece he wrote about the US military buying location data from seemingly innocuous apps, including a Craigslist app, as well as a Muslim dating app and Muslim prayer app, after this.

[MUSIC BREAK]

[AD]

[00:16:37] JL: Join over 200,000 top engineers who have used Triplebyte to find their dream job. Triplebyte shows your potential based on proven technical skills by having you take a coding quiz from a variety of tracks and helping you identify high growth opportunities and getting your foot in the door with their recommendation. It’s also free for engineers, since companies pay Triplebyte to make their hiring process more efficient.

[00:17:00] SY: Vonage is a cloud communications platform that allows developers to integrate voice, video, and messaging into their applications using their communication APIs. Whether you’re wanting to build video calls into your app, create a Facebook bot or build applications on top of programmable phone numbers, you’ll have all the tools you need. Formerly known as Nexmo, Vonage has you covered for all API communications projects. Sign up for an account at nexmo.dev/devnews2 and use promo code DEVNEWS2 for 10 euros of free credit. That’s D-E-V-N-E-W-S, in all caps, and the number 2, for 10 euros of free credit.

[AD END]

[00:17:46] SY: Joining us is Joseph Cox, Senior Staff Writer at Motherboard. Thank you so much for being here.

[00:17:51] JC: Thank you. Thank you for having me.

[00:17:53] SY: So tell us a bit about your journalism background.

[00:17:56] JC: So I started covering sort of the dark web and digital crime around 2013, 2014, before moving into privacy, hacking, cybersecurity, that sort of thing. And more recently, that’s been location data, specifically location data harvested from ordinary smartphone apps and sort of that ecosystem of data transfer between apps, brokers, and marketing firms or whatever it may be, but especially, more recently, government agencies as well. They’re definitely ending up as, like, end users of this data.

[00:18:31] JP: So you recently wrote a piece on VICE entitled, “How the US Military Buys Location Data from Ordinary Apps.” What is this piece about?

[00:18:39] JC: So this piece is about how a series of apps are sending data to a broker called X-Mode. X-Mode pays app developers to put their SDK into their app. I mean, they’re a very public-facing company. They’re relatively well-known in this space. But what we found, as well as the individual apps that were sending data to X-Mode, was that the company included defense contractors as its clients and ultimately some US military customers as well. That’s a combination of technical analysis of the app, network analysis, but then also speaking to sources in the industry, Senator Ron Wyden provided information as well, and of course X-Mode itself, because it’s a very complex supply chain here. At least what I found was that even people in that supply chain didn’t necessarily know where the data was ending up, especially some of the app developers themselves.

[00:19:36] SY: So how did you start your investigation into this?

[00:19:39] JC: So I’ve been looking at this location data industry for a while, mostly based on sources, documents, that sort of thing. But I really wanted to identify specific apps in that supply chain, in that ecosystem. So I started by finding the API endpoint for X-Mode. This was included in a recent report from the Australian government, and various cybersecurity researchers had also just tweeted this endpoint. So I started searching through apps for that using some automated tools that I have access to. Once I had done that and I found some with that endpoint, I chose some more specific targets to do some static analysis on, just downloading the APK, opening it up, trying to find out the context in which that endpoint was being used. But more importantly for me, I’m a journalist, not really a cybersecurity researcher, so I really wanted to see the data transfer in action. That was what was really important for our story. So that was just person-in-the-middle interception between the app and wherever it was sending data, and confirming that it was sending this information to X-Mode. So that was sort of the technical start of it. The other stuff is speaking to former employees of some of these companies that are providing location data to the government, and obviously speaking to Senator Ron Wyden and his office, who have been doing their own parallel investigation sort of into X-Mode and the location data industry writ large, really.
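
For readers curious what that interception step can look like in practice, here is a rough sketch of flagging that kind of traffic with mitmproxy, with the phone’s traffic proxied through your machine. The hostname below is a placeholder, not X-Mode’s real endpoint, and this is only an outline of the general approach Joseph describes, not his actual tooling.

```python
# mitmproxy addon sketch: print any intercepted request sent to a given
# host, so you can see what an embedded SDK is actually transmitting.
# Run with: mitmproxy -s flag_endpoint.py (device proxied through this machine)
from mitmproxy import http

SUSPECT_HOSTS = {"api.location-broker.example.com"}  # hypothetical endpoint

def request(flow: http.HTTPFlow) -> None:
    if flow.request.pretty_host in SUSPECT_HOSTS:
        print(f"[!] {flow.request.method} {flow.request.pretty_url}")
        # Dump the request body to inspect exactly what is being sent.
        print(flow.request.get_text(strict=False))
```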

[00:21:06] JP: So tell us about what kind of apps the military was buying this granular user location data from.

[00:21:12] JC: So there are sort of two parts to it. I should say that, yeah, we did find these two parallel streams of data transfer to the US military. The first one, which I haven’t touched on yet but is fairly straightforward, is US Special Operations Command, obviously a branch of the US military. They bought access to a product called “Locate X”, which is from a company called Babel Street, and they also just get information from a load of ordinary apps. My understanding is that it’s probably weather apps, e-commerce, that sort of thing. But then when it comes to X-Mode and finding the specific apps, there was a step counter, sort of a fitness tracker. There was one called “Global Storms”, which obviously would let you see where storms are coming in across the world, which would of course use location data. But the most interesting ones we found, at least in this specific US military context, were of course a Muslim prayer app and a Muslim dating app. Clearly, the context there is that if the US military is buying data from a Muslim-focused app, that could be of particular interest to agencies. We don’t know what exactly the military is using it for, but we did find that Muslim Pro, this prayer and Quran app, was sending information to X-Mode. Muslim Pro has something like over 96 million users. It does seem to be pretty popular. And for that reason, that was sort of the main app that I focused on.

[00:22:33] SY: And how did you find this out? How did you find out that the US military was buying all this data?

[00:22:38] JC: It was sort of a very complex puzzle with multiple different parts. I mean, for a while now, I’ve been trying to do a piece which is, “Hey, this specific app, X, Y, Z, is sending data to this specific government agency,” and we could probably get about, I don’t know, two-thirds or three-quarters of the way there with that. But the key part for us was that X-Mode’s lawyers had previously told the office of Senator Ron Wyden, “Yes, we sell data to US military contractors,” which goes to US military customers, and that includes phones in the United States. I mean, that is not something that X-Mode is going to tell me as an investigative journalist. I appreciate that it’s not information that they probably want out, but they did tell the office of Senator Ron Wyden, who in turn told me. We had all the technical analysis. We had this contract with Special Operations Command. You can have all of that stuff, but that was sort of the final link that made you go, “Okay, now I can actually write this story,” because it kind of glued it all together.

[00:23:42] JP: So obviously you found a pattern in the types of apps the US government is buying location data from. Mainly, it seems, they’re centered on Muslim audiences. And I’m curious, how is this kind of thing legal, especially when I imagine some of the data that they’re collecting and buying you would normally need a warrant to get? Wouldn’t you?

[00:24:00] JC: Right. Yeah. There are plenty of legal issues that come up with this sort of thing. When you bring up warrants, of course, that is very much a law enforcement context. We’ve also covered how Customs and Border Protection buys access to this sort of data, not the same supplier, but also location data. ICE is doing the same as well, and I believe other government agencies are doing this as well. Typically, as you mentioned, you’re going to need to have a warrant to access location data, after the Supreme Court ruling in Carpenter about cell phone location information. We still haven’t seen how this sort of data would actually be figured out in the courts, but a general understanding, or at least a good premise to go on, would be yes, it needs to have a warrant. But in the context we’re talking about now, the US military, the military doesn’t get warrants. They go and they do their missions, and of course, predominantly, almost exclusively, they’re going to be overseas. And in that sort of legal context, the military is not going to need to get a warrant. For me, there’s law enforcement on one side, and that is concerning when they’re doing this stuff. They’re essentially buying their way through the Fourth Amendment, by buying access to this data, which would otherwise require a legal process. But with the military, they don’t need that. For me, that was interesting because this was the first instance we’ve seen of the military buying and supposedly using this data in some form.

[00:25:22] SY: So why would an app participate in this type of program? As a developer, why would I join this data collection program?

[00:25:30] JC: X-Mode does pay the app developers to embed its SDK. I believe it does it on a per-user basis. So if you go on to X-Mode’s website, there’s some sort of calculator for figuring out how much you as a dev could get. I put in something like, “Oh, let’s pretend I have an app with 50,000 users in the United States,” which for X-Mode is probably going to be the most valuable sort of demographic of user, it being a US company. 50,000 US users got something like $1,500 a month, which is not a lot, considering we’re talking about the data of where someone sleeps, where they go to work, or in the context of some of these Muslim apps, of course, where they may worship as well. It’s exceptionally cheap. Of course, an app may have hundreds of thousands, if not millions, of users, so you can make a lot more money there. The Muslim dating app I looked at has something like a hundred thousand downloads in total. So they’re not going to make much, but Muslim Pro may have been able to make some money off of that.
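
As a quick back-of-the-envelope check on those figures (using only the numbers Joseph quotes, not X-Mode’s actual rate card), roughly $1,500 a month for 50,000 US users works out to about three cents per user per month:

```python
# Rough arithmetic on the quoted payout for a hypothetical app.
monthly_payout = 1_500   # USD per month, figure quoted above
us_users = 50_000

per_user = monthly_payout / us_users
print(f"~${per_user:.2f} per user per month")   # ~$0.03
```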

[00:26:29] SY: So do you think that these developers who participate, are they aware of where this data is going and what it might possibly be used for?

[00:26:39] JC: The impression I got when I was speaking to some of the developers behind these apps is that, I mean, they do know that they’re working with X-Mode, at least, of course, because there’ll be a formal communication channel there and a formal contract there. X-Mode probably reached out at some point. But when it comes to where the data actually ends up, at least a few of the devs I spoke to had never heard of X-Mode selling to military contractors. Me emailing them was the first time they’d ever heard about it, and they certainly hadn’t heard about the US military getting their hands on it. So it is a black box, this location data industry, even for the people who are inside it, especially at least some of the app developers.

[00:27:14] JP: So what would be your advice to developers and companies who may not want to be a part of something like this?

[00:27:19] JC: I mean, as a journalist, it’s not really my place to say what people should and shouldn’t do. We will just get the information, verify it, and then people can make an informed decision. But the one point I would make is that people should be able to make an informed decision. Whether you’re an app developer and you want to put an SDK into your project and it then sends data, it seems that you should be able to know what is happening with that data. Or if you’re a user and you’re downloading Muslim Pro and the app doesn’t say anything in its privacy policy about X-Mode, or there’s no disclosure about that, well, that doesn’t seem to be great either. So I couldn’t tell an app developer what they should or shouldn’t do. It is entirely up to them. But I do think they should be able to make an informed decision about it at least.

[00:28:00] SY: So I think the biggest concern with collecting this data isn’t the collection part so much as what you do with that data, and being able to potentially target certain groups unfairly and in nefarious ways. So as developers, as everyday citizens, is there anything we can do to stop this kind of transaction or at least kind of protect ourselves, protect our users, from the downside of gathering this type of data?

[00:28:28] JC: Yeah. I mean, I think as we’ve seen, as soon as you put an SDK into your app, or as soon as a user downloads an app and that transfer of location information has been completed, neither the dev nor the user really knows what’s going to end up happening with that data. So if you were trying to make concrete steps, and again, this is up to the individual, it’s not up to me as a journalist to say what you should or shouldn’t do. But if you were to do that, of course, if you’re a dev, you could just try to get more information on what is happening with that data, which of course could be difficult. But on X-Mode’s website, at least, there was a list of some of its customers, which they’ve started to remove since Ron Wyden’s office has obviously written about and investigated the issue, but there could be room for more investigation there if a dev wanted to find out what was happening with the data. Or as a user, and this is more on a personal sort of privacy level, if you haven’t played that game app or used that weather app or whatever it may be that requested your location information, you haven’t used it in a few weeks, maybe you don’t need it on your phone. Maybe you don’t need that software running in the background, because of course these SDKs are going to vary, obviously, between them, and it’s going to vary from app to app, but they’re not necessarily only going to be operating when the app itself is open. Of course, they’re there and they’re going to be potentially calling out and sending information even when the user isn’t actively engaged with the app. So perhaps a user doesn’t need it installed there in the first place.

[00:29:53] JP: That’s a good point. You point out in your piece that many of these apps weren’t even informing users their data was being sold or who it was being transferred to. So, outside of deleting apps that you’re not using, is there anything else end consumers can do to determine where their data is going?

[00:30:11] JC: I mean, that’s the horrible thing, but if an app has not put it in its privacy policy or made any sort of disclosure that this transfer is being made, ultimately an ordinary non-technical user is not going to find out. They would have to then do something like what we did, like the network analysis, to see the data being transferred. What a user could do, similar to not having the app installed, would be to just be very cautious about sort of the operating-system-level permissions that you hand out. So of course, when we installed Muslim Pro and the other apps, the Android or the iOS prompt said, “Hey, this app wants to use your location data. Do you want to grant it?” But that’s not the same as an app privacy policy saying, “Hey, yes, we’re now going to sell this data and we’re going to literally get monetary value for it.” I mean, of course, as you can probably tell from my accent, I’m British and European. So I’m very familiar with GDPR and that sort of thing. This would not be legal, at least to my understanding, under GDPR. You can’t just have an Android or an iOS operating-system-level disclosure and think that’s enough, especially when it’s not mentioned in the privacy policy either. So users can just maybe be a bit more conservative about the permissions they hand out as well, because we don’t have the privacy legislation, at least in the United States, there to sort of protect it.
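
If you want to audit your own permission grants on Android, one rough way is to list installed third-party apps and check which ones currently hold location permissions via adb. This is only a sketch under a few assumptions: adb is installed, USB debugging is enabled, Python 3.9+, and the dumpsys output format on your device matches the strings checked below.

```python
# Sketch: list third-party Android apps that have been granted location
# permissions, using adb. Illustrative only; output details vary by device.
import subprocess

def adb(*args: str) -> str:
    """Run an `adb shell` command and return its stdout."""
    return subprocess.run(["adb", "shell", *args],
                          capture_output=True, text=True, check=True).stdout

# Third-party (user-installed) packages only.
packages = [line.removeprefix("package:").strip()
            for line in adb("pm", "list", "packages", "-3").splitlines() if line]

for pkg in packages:
    dump = adb("dumpsys", "package", pkg)
    if ("ACCESS_FINE_LOCATION: granted=true" in dump
            or "ACCESS_BACKGROUND_LOCATION: granted=true" in dump):
        print(pkg)
```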

[00:31:29] SY: Is there anything else that you’d like to add that we may not have gotten to?

[00:31:32] JC: I think just the main point I kind of want to get across is that this is really more about informed consent. Beyond sort of the legal ramifications there, the legal structures in place, it really comes down to whether a user should or can make an informed decision about what is happening with their data.

[00:31:49] SY: Absolutely. Thank you so much for joining us.

[00:31:52] JC: Thank you. Appreciate it.

[00:32:03] SY: Thank you for listening to DevNews. This show is produced and mixed by Levi Sharpe. Editorial oversight by Peter Frank, Ben Halpern, and Jess Lee. Our theme music is by Dan Powell. If you have any questions or comments, dial into our Google Voice at +1 (929) 500-1513. Or email us at pod@dev.to. Please rate and subscribe to this show on Apple Podcasts.

Copyright © Dev Community Inc.