Is AI listening to your every word? Apple’s massive $95 million settlement over Siri’s alleged eavesdropping raises big questions about tech privacy — are we being spied on, and do settlements mean anything if Big Tech keeps cashing in? Meanwhile, Texas slaps Meta with a record-breaking $1.4 billion fine for facial recognition violations, but does it really stop the privacy invasion, or is it just another cost of doing business?
Plus, is telehealth getting a futuristic facelift or just a pricey gimmick? We dive into the rise of “hologram doctors” and whether AI-driven consultations are the future of medicine — or just a high-tech illusion. And finally, CNN joins the paywall craze, but will anyone actually pay for their news? Join us on TechWatch Radio as we break it all down!
[00:00:13] An incredible edition of TechWatch Radio takes to the airwaves now. Ladies and gentlemen, I'm Sam Bushman, Jay Harrison with me. NetworkProvidersInc.com is the promoter of the show. We keep an eye on tech so you don't have to. And man, last show I didn't know if we had enough time to really work on the ChatGPT thing. The phone line, it's kind of fascinating. I don't know what you mean by that show, Jay. It's not real.
[00:00:48] It's all a figment of our imagination, Sam. You're not really listening to it. We're not really here. Are you really here? No, no, not at all. Anyway, I found it fascinating. I didn't know if we had enough time to cover it. It looks like we do. We did. It was great. It did perfectly. But it is really cool. You got to admit, right? For a landline, somebody just has a flip phone or they want access to this, your grandmother or whoever, you just call an 800 number and ask it. You have a question about anything.
[00:01:17] Anything. What's the distance from here to Pluto? You can just call it. Put it on your speed dial. I think it's pretty cool. Burn it up. I wonder if it thinks Pluto is real. It's probably not a planet, right? Anymore, supposedly. Just kind of wondering. It's just, you know. Anyway, isn't it different engines, though? If you use that or the free chat GPT, you get a certain engine and then you get.
[00:01:37] Yeah, and I'm sure it's their cheaper engine. It's probably their o1-mini model or something like that, or whatever, maybe even the older GPT-3 model because, you know, they're offering it for free. Type into ChatGPT if you can, you know, hey, tell me about TechWatchRadio or whatever and see what it says if you use the better engine. Because I wonder if the better engine knows who we are. It's kind of fascinating. Right when ChatGPT came out, we asked ChatGPT to write an article in the spirit of Sam Bushman and stuff like that.
[00:02:07] And it wrote an incredible article and knew who I was and everything. So it's kind of interesting to me that, you know, it's so easy to find TechWatchRadio on the Internet. NPITechGuys.com is where you go to see the shows. All right, here's the answer from a – From the real – this is actually the 4.0 model and it says TechWatchRadio hosted by Sam Bushman and Jay Harrison is a comprehensive talk or technology talk show that delves into various aspects of computers and technology and exploring their impact on daily life.
[00:02:34] Host Sam Bushman, co-founder of TechWatchRadio, yada, yada. And that goes on about Jay Harrison, and it's probably pulling this from the website. But it even talks about some of the previous episodes and Roku and all. So it definitely knows. I think that's just like a simpler model. Or maybe it didn't understand your question a little bit. You know, I don't know. I don't know what the exact question was. Yeah. Anyway, I found it very interesting to say the least. Again, these tools are incredible and they can be used for valuable things. But remember to take it a little bit with a grain of salt.
[00:03:03] Remember to realize who's in charge. You're in charge. Make sure that tech serves you. Tech does not own you. Yeah, it's just an advisor. And that's one thing people really need to keep in mind. It's not the gospel. You're not talking to, you know, an oracle or anything like that. It's just an advisor. You're just getting a second opinion. It's a virtual oracle, though, Jay. Maybe. All right. Hey, I guess Apple had to settle, Jay. What the heck's up with that? Yeah, this is kind of creepy and a little bit crazy, right?
[00:03:31] Apple has reached a settlement, a $95 million settlement, over a lawsuit accusing them of using Siri to eavesdrop on consumers. They've agreed to pay it. And they said the company was infringing on users' privacy by using Siri, Apple's artificial intelligence, to eavesdrop on individuals with Apple devices. It's kind of creepy that they would actually be doing that. And there must have been some culpability where they've decided that they're going to settle out on this thing.
[00:03:58] But it's a breach of trust, I think, with people. A lot of people look at Apple as not only the premier sort of technology, but also as like the safest walled garden ecosystem. Supposedly, iOS is the most secure operating system if you're doing banking and things like this. And this is a horrible breach of trust. It's secure except for from themselves. Yeah, exactly.
[00:04:25] And why they would jeopardize that is beyond me. I don't understand. Well, and what I don't understand, though, is this. Because the story lacks, in my opinion, detail that matters, Jay. It does because they settled. And so I think that may be part of the reason why they did, right? Because they just want to put it to bed. Did they not admit any wrongdoing, Jay? I don't know, right? All right. Well, here's the reason that I think it matters.
[00:04:49] Because, look, for that device to know when its name, Siri, is called and know when to respond to you, it really has to listen all the time, Jay. Right. Now you could say, well, it should only listen for its own wake word. It shouldn't listen to everything else.
[00:05:08] And my response is, that's debatable because how much do you have to listen to to try to isolate the context to respond when really summoned versus not intentionally and all that kind of stuff. So there's some leeway there. The question becomes, are you storing the data longer than you need to to get that done? And what are you doing with the data now that you have it? Those are the questions that I don't believe are being answered here that really need to be discussed.
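[Editor's note: The always-listening question the hosts raise here can be made concrete. Below is a minimal, hypothetical sketch of how an on-device wake-word loop typically works; the function names, buffer sizes, and the placeholder detector are all illustrative assumptions, not Apple's actual implementation. The key design point is the one Sam raises: the device must process audio continuously, but a short rolling buffer means nothing needs to be stored or sent anywhere unless the wake word fires.]

```python
# Hypothetical sketch of an always-on wake-word loop (not Apple's real code).
# Audio is processed continuously, but only a short rolling buffer is kept,
# and audio leaves the loop only when the wake word is detected.
from collections import deque

BUFFER_SECONDS = 2        # assumption: keep ~2 seconds of context
CHUNKS_PER_SECOND = 10    # assumption: audio arrives in 100 ms chunks

def detected_wake_word(chunk) -> bool:
    # Placeholder for an on-device keyword-spotting model.
    return chunk == "hey-siri"

def listen_loop(mic_chunks):
    """Yield the buffered audio only when the wake word fires."""
    ring = deque(maxlen=BUFFER_SECONDS * CHUNKS_PER_SECOND)
    for chunk in mic_chunks:
        ring.append(chunk)            # oldest audio is overwritten, not stored
        if detected_wake_word(chunk):
            yield list(ring)          # hand off context for the real query
            ring.clear()              # then discard the local buffer

# Example: a stream where the wake word appears once.
stream = ["noise"] * 5 + ["hey-siri"] + ["noise"] * 3
captured = list(listen_loop(stream))
```

In this sketch, `captured` holds exactly one snippet, ending at the wake word; the trailing chunks never leave the loop. The privacy questions the hosts pose map directly onto the two marked lines: how big the buffer is, and what happens to the data after the `yield`.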
[00:05:33] And when you get a settlement, it means we're not going to bring that to fruition and get all accountability out on the table. We're just going to simply pay a fee and move along. I'm not comfortable with that at all, Jay. Apple has maintained that they do not acknowledge any wrongdoing in the pending settlement, with the lawyers set to review the terms on February 14th. Additionally, the $95 million settlement is only a small fraction in comparison to the $705 billion Apple has made since September of 2014, when they say this originally occurred.
[00:06:02] So guilty or not, I think they're just like, hey, we just want to put this to bed. We don't even want this to be a problem. And if we did it, we're not doing it anymore, whatever. You know, but we don't admit any wrongdoing and we're just going to settle this out of court. So I predict they're going down the road of Google, which is we'll continue to spy on you, continue to gather your data, continue to violate your privacy, continue to abuse your rights, continue to. And we're going to pretend we don't do it every time.
[00:06:29] I mean, what we're going to do is we're going to make so much money on the front end that the back-end payments don't even matter a tinker's dang. And we're just going to go ahead and do that. And no harm, no foul. We're going to keep doing it because we're getting away with it. I hope that's not the case. Is that where we are? It may be. Actually, it may be in technology in general. That's where we are. But I hope that's not the case for their sake and their customers. Yeah. Anyway, Texas, I guess, is in the mix, too, Jay, a little bit with a battle on their own hands. Similar topic, right? Yeah.
[00:06:59] In what regard? I need a better lead. Texas reaches $1.4 billion settlement with, quote, Meta on, they say on Tuesday, but that's a couple of weeks ago, over its use of facial recognition scans, Jay. That's according to a press release. Texas Attorney General Ken Paxton. This is the Texas state government? Yeah. Wow.
[00:07:21] Anyway, Ken Paxton filed a lawsuit against Meta in February 2022 and said, hey, you're storing facial recognition data obtained by scanning pictures across Facebook, violating the Texas Capture or Use of Biometric Identifier Act. What do they call this thing?
[00:07:50] C-U-B-I, I guess is what it's called. And the Deceptive Trade Practices Act, and then they give a thing for that, too. That's according to court documents, Jay. What do you think of that? I don't know. So are they scraping Facebook for images of people so they can correlate with facial recognition, I guess, cameras that they have up around in the air? I don't understand.
[00:08:20] Maybe they're correlating it with DMV information. I don't know. I don't understand, I guess, a little bit on how people cannot scrape Facebook, especially like public profiles anyway, since people are specifically making it public. I mean, I can understand Facebook saying, hey, this is against our terms of services, and if we find out you're doing it, we're kicking you out. How do they make that illegal, though? I don't know. The other laws. Facebook is the one doing it. Meta is the one doing it.
[00:08:46] And they're basically using facial recognition to correlate this data with all kinds of different personal information, violating people's privacy. Oh, I see. Okay, you store that data. You capture it. It's deceptive trade practices. Ken Paxton of Texas says you're not doing that and reached a $1.4 million settlement. Or, I'm sorry, billion with a B, billion-dollar settlement over this thing.
[00:09:12] They say the settlement is the largest ever obtained by a single state and by a single attorney general in history. That's crazy. And this is a significant case in Texas or whatever. But again, it sounds like a lot of money, Jay. But really, is it a lot of money? I mean, if Meta can make $10 billion, they're going to pay out $1.4 billion. It's nothing, Jay. It's just a tax at that point. Right.
[00:09:42] So, I don't really know where all that's going. But they say after vigorously pursuing justice for our citizens, whose privacy rights were violated by Meta and their use of facial recognition software, I'm proud to announce that we've reached the largest settlement ever obtained, says Ken Paxton, the attorney general.
[00:10:05] So, I guess I still don't understand a little bit because they're using, Facebook is using facial recognition software on the photos that people uploaded to Facebook, willingly gave Facebook themselves? Yeah, but they didn't give Facebook the ability to save that information and use facial recognition and correlate it to a person and then have back-end data related to that kind of stuff and retain all that. I think they probably did if they read Facebook's terms of service.
[00:10:33] I mean, when you're on Facebook, you are the product. There's a $1.4 billion lawsuit over it, and they won. I know. They settled. I know. And they didn't do it because everybody agreed. They did it because people are being deceived, right? Yeah, and that court is probably saying, well, the laws here in Texas supplant the terms of service that are on Facebook. But, yeah, I don't know. People willingly give up so much data all on their own and don't realize it half the time.
[00:11:00] Well, so that's the problem is how much data do you give up versus how much do you understand you give up versus what are the laws in Texas and how much of that storing of that data and using it for other purposes that may or may not be agreed to. You know, that's the violating the law point, right? Yeah. Any – listen, here's what they say. Anyway, any abuse of Texans' sensitive citizen data will be met with a full force of the law, says Ken Paxton. Well, if they're not –
[00:11:27] And, again, the problem with these laws and some of these things, it gets so complicated. It's hard to break it all down. Not to defend Meta, but how do they keep up with all the laws that Texas passes? And then if people are volunteering their information, I mean, Meta could – their only choice would just be to say, all right, well, if you're in Texas, you just can't use Facebook anymore. I mean, they're not going to do that, but they could. Yeah. Meta admitted no wrongdoing according to the final settlement documents.
[00:11:56] They say Facebook discontinued the use of its facial recognition software in 2021 according to its website. Now, whether that's true again or not, I don't know. So the point is, if I upload a photo, that doesn't mean that you can use facial recognition on it and then take that correlated photo with my personal name and what do they do with it? See, that's the real downstream discussions. Yeah, but just because Facebook isn't doing it, anybody scraping Facebook could be doing it. Yes.
[00:12:26] And you're putting your photo out there on the internet saying, this is Sam Bushman, this is all my other info. I mean, there's a lot more they could correlate just by looking through your posts and stuff, if you're an active Facebook user, than you're going to get from using facial recognition and correlating against some things. I mean, I guess that depends on how big you are and how much data you have access to, too.
[00:12:49] Yeah, I guess the bottom line is they're just saying, hey, Facebook can't be the guys to store the photos, let you upload the photos, do the facial recognition on the photos. Yes, people could do some of those things outside of that. But, hey, creating a holistic, simplistic way to do it. You know, at some point there's a problem with that, I guess. Yeah, they're horning in on the government's racket. You can't be doing all that stuff at once. Well, and my response is just because I upload a picture, that doesn't mean I agree you can do facial recognition and correlate it with my name and do all kind of things with it.
[00:13:19] Right. You know, when does that get correlated with, for example, isn't there like a Facebook marketplace or something like that? Yeah, there is where you can buy and sell stuff. So then you correlate that with the things that I bought. Then you sell that to companies that want to sell me stuff. And see, I never agreed to all that. That's true you didn't, but you kind of did. I mean, anybody can do that. Well, and that's what the debate is. Can you do that or is it against the law? In Texas, they're saying it's against the law. Right.
[00:13:48] And, you know, again, I understand your point, Jay. I'm just saying that's what this debate is about. So as long as you're in China, it's all right. A lot of this stuff is very sticky. Yeah. A lot of this is complicated. If you're in China, then you can do all that stuff with Sam's data. But if you live in Texas, you better not be doing that with Sam's data that he put on the web of his own free will. Yeah. The New York Times has an interesting article. They said the doctors are in, Jay. They're just not where you are. They're hologram doctors. Did you see this, Jay? I did not, no.
[00:14:16] I haven't seen any hologram doctors lately. They're calling them hologram doctor technology. Is this just telehealth or no? No, no, no. They're holograms, Jay. Telehealth is where you, you know, connect and you really see a real doctor. You just see them over the Internet or whatever, right? Yeah, like FaceTime or whatever. This is a hologram doctor, buddy. I haven't seen any real holograms that have, I mean, everybody's talked about it.
[00:14:43] They've done things like run a projection on a smoke kind of, you know, a smoke panel and things like that that can look sort of like it. But I don't know they had real holograms yet, Sam. Yeah, some health care professionals are wondering if it's beneficial at all. A patient walks into a hospital room, they give this example, Jay, sits down, starts talking to a doctor. The only problem is only in this case the doctor is actually a, quote, hologram.
[00:15:13] It might sound like science fiction right now, they say, but it's actually the reality for some patients at Crescent Regional Hospital in Lancaster, Texas. I think they're being fast and loose with the term hologram. It's probably just a screen or something. It's a telehealth. That's what it really is. Well, but I don't think a real doctor is there is the point. You're saying it's like an AI doc? Yeah, but a hologram of a doctor.
Oh, I get you. I see what you're saying. Like an emergency medical hologram, an EMH. They say they've been able to remotely see their doctors, I guess, offering patients the ability to see their doctors remotely as a hologram. Through a partnership with, what is this thing called? Holoconnects.
[00:16:12] A digital technology firm based in the Netherlands. So you got the Netherlands, you got Texas, wherever you are, isn't where the doctor is. They say each photobox, or I'm trying to learn these terms as I go, right? Each holobox, the company's name for its 440 pound, seven foot tall device that displays on a screen,
is a highly realistic 3D live video of a person. It costs $42,000 with an additional annual service fee. Okay, so this is a very immersive 3D kind of view of a remote doctor. So it's telehealth with a fancy interface. Fancier than on your iPhone, right? That's all it really is.
[00:17:12] They say that it gives you the feeling that a doctor is actually sitting inside the box. When in reality, the doctor's miles away looking into a camera. I have seen this. Not in person, but I've seen stories of this, yeah. Anyway, they want to extend the service to traditional appointments. Right now it's going to be in beta tested and everything else. And so here's the point. The hologram that you're seeing isn't really the doctor. But there is really a doctor somewhere else, you know, in a camera.
[00:17:40] So now the term that you bring up, Jay, you know, what is it really a hologram comes to mind here, right? Yeah. It's just a... Because it's not really a hologram, but they're saying that it is. I mean, for that matter... The doctor's not in that box, Jay. The doctor's not what you're seeing at all. You can put on a VR headset and do this even easier without the $42,000 equipment. No, dude. You've got to have the $42,000 unit, bro. No. It's medical. You can't do stuff on the cheap like that. Yeah, that's just like a line item, man. Get 10 of those.
[00:18:11] Sign us up. It really is crazy. Raise your health care premium so we can have holodocs. Holodocs in a box everywhere. Anyway, I just find it fascinating how that all, you know, works and everything else. And I personally think the telemedicine thing is a great idea for most situations. It is for a lot. It does work well. Because especially when you need a specialist, you want to talk to somebody who can read an X-ray or read an MRI or an EKG or something. And they're maybe in a bigger city or in a specialized unit, several towns over or whatever.
[00:18:40] I think it's great. I think it's wonderful. Or even be able to just talk to your provider or one of their associates from home and you don't have to make the trip, especially if it's something simple. It's a great idea. And it actually saves a lot of money in health care. And that's a win-win for everybody. And I think it's great. And I think that the hologram, though, let's just separate this for a second. Just get rid of that box for a second. What you really have is a doctor looking in a camera at you. The hologram is just to make you feel like you're sitting in front of somebody.
[00:19:10] But the hologram is not even a real person. The doctor literally looking into the camera on the other end is, in a way, completely separate from this hologram, I don't know what you want to call it, display. It's just designed as a backup to make you feel better. Is that the idea? Yeah. And it could also be like an avatar, where it doesn't even really look like that other person, the doctor on the other end. He might be in his, you know, slippers and stuff. Yeah. And he's just doing a consultation. But then this doc on the other end, that's the hologram, is in a lab coat.
[00:19:40] And he's all, you know, there in a suit and tie and whatever. It could be, there could be layers to this, but it all amounts to, in my opinion, telehealth. It's just fancy telehealth. It is fancy telehealth. Although, again, this hologram thing and this virtual reality thing is interesting. Now, how much, you know, does it make sense to have virtual reality on top of your doctor's visit? To me, it almost can border on distraction and problems if you're not careful.
[00:20:06] Now, when they mix this with a super intelligent AI that can actually probably do a pretty good job of diagnosing things from what people say and show and tell. In many cases, the scientists are saying better than a doctor. Yeah, I know. And you're going to have kind of what I was alluding to earlier in EMH. You'll have this medical hologram like we saw in the 90s on Voyager where you can have this instant doc that you can just kind of pop out and help you with whatever you need.
[00:20:32] And then, you know, that could save a lot of money probably in health care too. And be more knowledgeable, especially than new graduates and stuff. I mean, you can have a wealth of knowledge in one of these bots. I think especially new people coming out of med school should maybe even consult or have one of these as an aid to help them with diagnosis and things they might overlook or not correlate.
[00:21:28] Yeah. There's an AI behind it, Jay. Right. And that won't be long off, I don't think. And how long will it be until AI responds to the doctor? The doctor reviews it and says yes. No. That's already happening. Hold on. That's already happening. Yeah. With x-ray and MRI readings, that stuff's already happening. They have AIs looking at this and finding things that the doc may have missed. And pointing things out to the doc in real time. Yeah, exactly.
[00:21:58] And then how long have you got before this is a problem? And it's incredible. How long do you have before the doctor just kind of becomes this vestigial yes-or-no man to the thing, and it already becomes super accurate, to where they'll just cut the doctor out at some point? Well, and at what point does the doctor, if he gets trained as a doctor and then he gets released and then he's just the yes, no man over time, does the doctor really have the ability
[00:22:25] to analyze these things like the old days when you really had to pay attention and think about it and go through it in your heart and mind? And what I mean is at some point that changes, right? Yeah. At some point he's not going to second guess the AI either if he thinks or knows or has seen in his own experience how accurate it is and the things that it has seen that he has missed. You know, you could see a road down which in not too long the humans could be almost like
[00:22:53] trusting too much into AI and other things and not have their own skills. Amen to that. I guess CNN launched the digital paywall, Jay. Really? As if their viewership is not low already. I don't know where all that goes. They want $3.99 for it and stuff like that. They're starting with kind of the price low. Well, it depends on who you are. They say it's just the first step in their journey, Jay.
If you have a lot of money, you could probably get more expensive memberships just to support them. Yeah. Anyway, I just find that really interesting though because paywalls are becoming more and more common. Paywalls are dumb. They're big boys. Online content, I think, in a lot of ways. Unless you have really specific stuff that only a specific group needs. For example, you look at like college papers and research stuff. Some of that, a lot of that's behind a paywall and people need it for certain reasons and things
[00:23:50] and they're willing to pay because they're trying to get their doctoral thesis or whatever done. But paywalls for news stories, anytime I see those anymore, I just bail on it. It's like, well, that story is not important enough to look at. And I'm not saying I support this, but there's even technology where you can press a couple of buttons and go to places that will simply circumvent the paywall. In a lot of ways, you can just hit reader mode on your browser and it gets rid of the pop-up that's blocking what you're trying to see anyway.
[00:24:16] But, you know, it's this, I don't, unless you've got a dedicated core of people, there's other ways you can approach this, which maybe we can talk about in a future show. Paywall, no way, for TechWatchRadio. NPITechGuys.com. We keep an eye on tech so you don't have to. Brought to you by NetworkProvidersInc.com. You've got a friend in the IT business. Thanks so much. Make it a great tech day, will you? Hey, thanks.


