Reels & Risks: The Cyber Pulse
NPI Tech Guys · March 27, 2024 · 0:24:50 · 22.74 MB

* Instagram is working on a feature that would allow you to let others put a ‘Spin’ on your Reel - Aisha Malik.

* Instagram is working on a “Spins” feature for Reels, its short-form video TikTok clone, the company confirmed to TechCrunch.

* Suspect Your Computer Has Been Hacked? Do These 5 Things Now!


[00:00:00] Yeah, baby. Another edition of TechWatch Radio coming to you from the studios of NPI Network.

[00:00:55] Jay Hill, headline: Instagram is working on a feature that would allow others to put their own kind of spin on your Reel.

[00:01:07] Now this is going to get wonky kind of quick, I think. So if I have an Instagram Reel, I guess, Jay, that's what it's called, right?

[00:01:15] Yeah.

[00:01:15] And I put it out there and then you can do your kind of take on my Reel, so to speak.

[00:01:21] Okay, so this is more than just a reaction or something; it's more like a rebuttal, or what do they call that on TikTok?

[00:01:31] We were just talking about TikTok in a previous episode, but there's a term for when they repost or pin or whatever.

[00:01:37] This is more than that, because "you put your quote-unquote spin" is what they've got, so "spin" kind of means something in their

[00:01:43] language or whatever. So they can put their own spin on your Reel, and how flexible is it, and what control do you have over what

[00:01:50] kind of details go into the quote-unquote spin? Because I guess if you're going to create a new term, is it a whole new video?

[00:01:58] Is it a... what? I don't know what it's going to be like. I haven't tried it yet.

[00:02:01] Yeah, I think it's going to be just basically like where you can quote it and kind of pin that one and be able to reference it.

[00:02:09] So you take a little excerpt out of it, talk about it, you know, play parts of it if you want to but then it links back to the original video.

[00:02:16] It sounds very similar to kind of what TikTok's doing.

[00:02:19] So in a way, you know, you could say great new technology.

[00:02:23] In a way you could say, hey, maybe they're just playing catch-up, right, Jay?

[00:02:26] Yeah, it depends on how feature-rich it is, meaning can I put my video in line with theirs, or can I put me on the screen talking at the same time they are and pausing them like I'm talking to them directly, or how advanced can it be?

[00:02:38] Yeah, we'll have to see.

[00:02:41] I like it when they bring out new features though. I think it's pretty cool.

[00:02:45] Well, whether they're playing catch-up or whether they're kind of blazing the trail, the details are in the features, which we haven't seen yet. That's the point.

[00:02:52] Yeah, we will check this out and keep an eye on it and let you know though.

[00:02:55] And the point is they're working on this feature and it could be kind of cool when they say you can put your spin.

[00:03:00] It really depends on what is their version of spin?

[00:03:03] What does that mean in this app in terms of its functionality and capability?

[00:03:07] For example, can my spin include me and AI together?

[00:03:11] Sure. Why not?

[00:03:12] So can I fact-check you with my spin, using AI on the fly to research and have AI tell me the truth and see if it jibes with your truth?

[00:03:20] I don't know how to look at that interaction.

[00:03:23] I like how you said your truth.

[00:03:25] Yeah, I don't mean yours as in Jay's.

[00:03:28] No, no, no, but we all have our own truth right?

[00:03:31] Well, the truth is the truth.

[00:03:33] It's not your truth, but the point is, if you say something publicly, let's say you're a politician, you say something publicly, I'm going to call that your truth right now, regardless of whether it's really true or not.

[00:03:43] Yeah, but you're taking it as a lie.

[00:03:46] Okay, so now it's a debate and sometimes there's a right and wrong, a true and false.

[00:03:50] Other times though, there's, I wouldn't say there's multiple truths.

[00:03:53] Well, there can be a matter of opinion.

[00:03:55] Yeah.

[00:03:56] Your favorite color might be red and mine's green, but that doesn't make either one right or wrong necessarily.

[00:04:02] Yeah, but that's not like a matter of you know, did this event happen at this time kind of a thing?

[00:04:07] Right or if we both go to an event, we may come back and tell different stories, but both be telling the truth.

[00:04:12] Yeah, based on perspective, and that's where it, yeah.

[00:04:17] Anyway, it's very interesting.

[00:04:19] We'll just kind of see what this means.

[00:04:21] If TikTok gets banned, where will people go?

[00:04:24] I don't think it will be banned by the way.

[00:04:26] That's from the last episode just so you know.

[00:04:28] But now Instagram working on this feature and I really wonder how much of a tweak and how.

[00:04:33] I don't know what you want to say for example, let's say Jay does his video.

[00:04:38] Then I want to do my quote spin on it.

[00:04:40] Can I press a button and make me look and sound like Jay in my spin, my AI-generated Jay on my behalf?

[00:04:46] I hope not.

[00:04:47] No, I don't think it's going to be that at all.

[00:04:49] And as the person who originally created the video, I don't think you're going to have any control either over how they use it or what they do with it.

[00:04:56] I think this is just a way to get people to create content, because when you get people working off of each other, you get that interaction.

[00:05:05] But our users are going to make a version doing exactly what I'm telling you.

[00:05:09] Well, yeah, they will AI can do that kind of stuff, but that stuff is getting almost quasi illegal.

[00:05:14] I mean, remember, I think we've talked about the people impersonating Biden and making all those phone calls and stuff and they got slammed down.

[00:05:21] It's easy to say, but even so, spamming like that is done all the time for big money.

[00:05:25] So just watch: there's big money behind this, and there will be, I don't know what you want to call it, an evil side to AI, an underground, whatever you want to say, where there's big money in distraction, deepfakes, whatever word.

[00:05:37] I'm telling you right now, this points to that big time in terms of where it's starting.

[00:05:41] I know this isn't that, but I'm telling you that's where it leads.

[00:05:45] Anyway, there you have it.

[00:05:47] Right.

[00:05:50] What if your computer's been hacked?

[00:05:52] People are just like, you know what I hear this all the time.

[00:05:54] First off, you need to know if it's really true or not.

[00:05:56] Yeah, a lot of people think their computer's hacked and it's just because they clicked the wrong button or whatever.

[00:06:00] Well, or people think their computer's hacked because, you know, it's running slow or it's this and that, and it may be hacked.

[00:06:06] Right.

[00:06:07] But you've got to just step back and say let's just prove if it's hacked or not and there's ways you can kind of know.

[00:06:14] You can get a digital person in there to kind of confirm, hey, you know what?

[00:06:17] There's none of this stuff on here, with all the tools that we know of and everything else.

[00:06:21] But short of that, there's a lot you can do yourself to figure this out.

[00:06:25] So someone just says, Hey man, for example, I had a person that I'm really close to email me.

[00:06:30] No, no, they texted me the other day.

[00:06:32] They said Sam, have you been hacked?

[00:06:34] I'm like, no why?

[00:06:35] They're like because we're seeing all these things.

[00:06:37] We're getting pictures of you and then your name randomly.

[00:06:41] And then when we, you know, click on it or whatever, it just goes to something entirely unrelated.

[00:06:47] And so my first thought is, no, I haven't been hacked, but you have, right? Because I'm just going, okay, that's, you know.

[00:06:54] So then but I wrote back and I said what technology?

[00:06:57] What technology are we using to determine this?

[00:07:00] Are you seeing these in Facebook or on your text or email?

[00:07:04] Where are you seeing these? And I haven't gotten the answer to that detail yet.

[00:07:07] But I bring that up because that's how you can kind of troubleshoot.

[00:07:10] We should call them right now, Sam, on the air and find this out.

[00:07:13] We got to get to the bottom of it right now.

[00:07:16] We really should.

[00:07:17] You're probably right about that.

[00:07:20] But anyway, if you suspect your computer has been hacked, network providers put together kind of a newsletter article that talks about what to do.

[00:07:30] Suspect your computer has been hacked? Do these five things now.

[00:07:33] It's a pretty good summary of kind of what matters here,

[00:07:36] Jay, I think, on this topic.

[00:07:38] They do. They've got a five-step article here, and you can also subscribe to these newsletters and get some of this information delivered right to your inbox, too.

[00:07:47] But number one, they say: take the device off the network and isolate the incident, but do not turn off the device or reboot it. When a device isn't working the way it should,

[00:07:57] the go-to move is to hit restart, but in many scenarios this maneuver may not help if malware is involved.

[00:08:03] That simple act can make the situation worse, and in some circumstances rebooting your device can set in motion a file encryption process that can cause unrecoverable data loss.

[00:08:12] So disconnect your device from the network, but allow it to remain on as you move through the steps.

[00:08:17] That's their advice.

[00:08:19] I happen to take issue with that a little bit though because I think you should shut it down right away.

[00:08:24] And the reason for it is this.

[00:08:27] I've seen, you know, I actually think that it goes both ways and you just, you can't know because it depends on how the malware is going to work.

[00:08:33] Sometimes that can put things into motion that you can't stop but a lot of times let's say that you discover malware is happening and it's encrypting stuff in the background.

[00:08:42] If you shut it down and stop it and then open that drive on another computer externally not booting from that drive.

[00:08:55] And you may be able to recover stuff that has not gone through the encryption process yet, if it's chunking through data.

[00:08:55] So there's pros and cons to that.

[00:08:58] Yeah, because the other side of it is there is software, there are capabilities, where when you shut it down it either, one, wipes everything, or, two, encrypts everything in ways that you almost can't get back.

[00:09:10] That's right.

[00:09:11] I think in most cases, because I'll never say never, but it's really hard to get back.

[00:09:14] And so there's two sides to that.

[00:09:16] I agree with both sides.

[00:09:17] I mean, I don't really have an answer.

[00:09:18] I think both of them are issues right?

[00:09:20] Yeah, I think they're both legitimate.

[00:09:21] I'm not taking sides on it.

[00:09:22] They're legitimate.

[00:09:23] I don't know how to answer or resolve the two either.

[00:09:25] I'm more of the shut-it-down camp, but number one, what I can totally agree with is get it offline.

[00:09:31] Disconnect the avenue or the pathway that they have to connect to your hardware; it doesn't matter whether it's a light bulb or a computer or whatever.

[00:09:39] Yeah, get it offline.

[00:09:40] And then you have to make the call.

[00:09:42] Do you want to shut down?

[00:09:43] You need your expert to look at it.

[00:09:44] I guess I look at it this way if you're not an IT person, you probably want to get your expert to look at it.

[00:09:49] If you are and you know what you're doing, you may want to just shut that thing off and then examine it externally and see what you can at least recover or get from that device.

[00:09:57] But yeah, the other thing is, there's a reason they're telling you not to shut it down.

[00:10:02] This assumes that you have an IT team.

[00:10:05] Then you put in your protocols that say, when there's a breach or a problem, here are the actions that we take.

[00:10:10] One of them is we get a hold of our IT partners immediately and then our IT partners can look at it.

[00:10:14] And if it's a remote connection and you shut it down, now if we need to look at it remotely, you go, you're telling me now I've got to boot that thing up again?

[00:10:22] Right.

[00:10:23] Now you've shut down and rebooted, whereas they would rather see it with the problem but untouched.

[00:10:29] So if you've got an IT guy that you can bring on quickly, and that's the real value of IT professionals, being able to say, you know, hey, I can get a hold of my IT guy.

[00:10:36] They'll be on my computer in less than an hour or whatever the case may be.

[00:10:39] Then it's like, all right, I should leave it on and let them at least look first.

[00:10:42] Exactly.

[00:10:43] And if they decide to shut it down, then they do, and so on.

[00:10:47] But you don't want to go through that whole process if they can see it fresh.

[00:10:50] Now if you don't have an IT person, you're like, I'm going to have to call him.

[00:10:53] It's Friday. I'm going to have to call him.

[00:10:55] Shoot. I'm out of town on Monday.

[00:10:57] I guess I'll try to call him on Tuesday, and then they can't get there until, say, Thursday, because you turned it off or whatever.

[00:11:03] You know, you're taking this cycle that could have been really simple and making it very complex.

[00:11:08] So I agree with the first one, although I see both sides of the discussion.

[00:11:11] Yeah.

[00:11:12] And number two is call your IT team immediately.

[00:11:15] It's important to contain the breach before it infects the rest of your network and causes any more damage again while you want to pull it offline.

[00:11:21] Your IT team will be able to investigate the issue, determine what went wrong, what the impact was and mitigate the breach quickly.

[00:11:29] They say don't try to fix this on your own; attempting to run system cleanup or your antivirus software can, most of the time, only waste time and cause more damage. Call the experts immediately.

[00:11:39] And it's the kind of thing too where if you have remote people that can look at it, they can look at it remotely very quickly.

[00:11:43] Whereas scheduling a time to come out may take more time.

[00:11:46] And so that's kind of the reason to say, hey, keep it on.

[00:11:48] Let us just check real quick.

[00:11:50] Then we'll make a decision and tell you what we think is best.

[00:11:52] Hey, you know what? We'll just turn that off, or, you know what?

[00:11:56] So there's ways to do that.

[00:11:58] And the question is, how do you disconnect right away and still let us see it?

[00:12:02] The answer is you disconnect immediately.

[00:12:04] Then when we need to see it, we connect it just for a minute for us to just check.

[00:12:08] It doesn't take us too long to at least get a handle on things.

[00:12:11] And then we can tell you whether you should, you know, disconnect it, turn it off or what you should do from there.

[00:12:16] And that way you get kind of the most professional viewpoint that we can get about the choices right?

[00:12:21] Exactly.

[00:12:22] All right. Go ahead, Jake.

[00:12:24] Step three is call your attorney. Depending on the size of your breach, your attorney may want to refer you to outside legal counsel with privacy and data security expertise

[00:12:31] who can advise you on federal and state laws that apply to the breach.

[00:12:36] So if you know this is a breach and you know what's going on, you're going to be wanting to get on the phone with your attorney pretty quick.

[00:12:42] Yeah, because there are a lot of laws that relate to breaches or potential breaches and stuff, especially if you deal with credit card data

[00:12:49] and/or, like, pharmacy data or any personal data if you're a doctor. You know, there are all kinds of reasons,

[00:12:56] and you've got to have people that know these things and know what to do. And the best thing to do is get an attorney and say, hey, am I good on this?

[00:13:01] And, you know, they can let you know and that might be overkill for some companies too.

[00:13:06] I mean, I get it; who has an attorney just on speed dial, right,

[00:13:09] If you're a small company, but it is important.

[00:13:12] And we mainly want to point it out because we don't want it to be something that you could have done that you didn't because you didn't know.

[00:13:16] So we want to tell you, and if it doesn't really apply to you, then, hey, I'm a sole owner-operator.

[00:13:21] You know, I can get an attorney if I need one, but I don't have one. You may not want to spend time on it, but it is something critical and important to be aware of that it does matter.

[00:13:29] And there are laws and rules and guidelines; people have ignored that in tech forever, but I'm telling you it's coming to a point where you won't be able to ignore it any longer.

[00:13:37] You're going to have all kinds of troubles.

[00:13:39] So tech is changing fast in that regard indeed.

[00:13:42] All right, next one: change passwords and secure accounts, Jay.

[00:13:45] That's right, change passwords. While your IT team is working on containing the breach, you're going to want to make sure that all your accounts are secure.

[00:13:50] Now, I'd recommend you get on to a different device. Of course, don't try to use the one that's having an issue but make sure that you've got multi factor authentication involved.

[00:13:59] You've got whatever accounts or things that are being a problem.

[00:14:04] You're getting that stuff changed. You need to begin working through all your accounts.

[00:14:07] Make sure they're secure. Start with ones that contain financial information, like credit card numbers and Social Security numbers; prioritize those first.

[00:14:16] Amen. And it's very important to have appropriate passwords. You can't have bad passwords. You got to have good strong passwords.

[00:14:24] There's a debate whether you should change your password often or not. I'm in the camp that says don't; just make it unique and strong enough that it's not a problem.

[00:14:31] I'm not saying changing once in a while is a bad idea, but I'm just saying I wouldn't have consistent changes.

[00:14:36] That's a disaster waiting to happen: getting yourself locked out of your account with one mistake every time you change it.

[00:14:42] So I think that's more dangerous, personally. I agree. It also generally makes people use weaker passwords, where they're just incrementing a number or something like that.

[00:14:50] Actually, that security "practice," I say in quotes, has kind of been ruled out, but a lot of governments and whole other organizations that adopted it when it came out, like in the late 80s, early 90s, are still adhering to it.

[00:15:04] Oh, you've got to rotate your password every 90 days? That really makes things less secure. You're better off using a long, strong, good password and not locking yourself out.

[00:15:13] Not only is it less secure, but, like I say, if you work for a company that has, you know, 50, 100, 200, whatever people, hey, man, someone's always locking themselves out. And it is a nightmare to solve.

[00:15:21] I mean, look at the cost of just dealing with that, because, you know, you have three, four people a day who can't get in now.

[00:15:29] And then you've got to have somebody just dedicated to fixing that stuff and spending an hour or two a day.

[00:15:34] They're not doing it on purpose. They didn't try to screw up, but, you know, people get locked out. It's like, I don't know why; I swear I put the password in like I'm supposed to, but it doesn't work.

[00:15:42] And now I've got to, you know... that happens all the time, right? Yeah, it's a poor practice.
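
The intuition behind "one long, strong password beats a short, frequently rotated one" can be made concrete by comparing brute-force search-space sizes in bits. This sketch is not from the episode; the `entropy_bits` helper and the 95-character printable-ASCII alphabet are illustrative assumptions.

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    """Brute-force search-space size, in bits, for a random password."""
    return length * math.log2(alphabet_size)

# A random 16-character password over ~95 printable ASCII characters...
long_pw = entropy_bits(16, 95)   # ~105 bits

# ...versus an 8-character password "rotated" every 90 days. Attackers
# model predictable rotations (incremented digits), so rotation adds
# roughly nothing; the search space stays that of 8 random characters.
short_pw = entropy_bits(8, 95)   # ~53 bits

print(f"16 chars: ~{long_pw:.0f} bits, 8 chars: ~{short_pw:.0f} bits")
```

Each extra random character multiplies the attacker's work by the alphabet size, which is why length dominates rotation schedules.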

[00:15:46] All right. Number five, check your bank accounts. Jay.

[00:15:49] That's right. Not all cyberattacks are financially motivated; however, making your bank accounts

[00:15:54] the primary target is often what they do. So as the breach is being mitigated, check all your bank accounts and payment processing tools, including third-party merchant accounts and all employee payroll, everything, for any anomalies or sudden changes.

[00:16:07] You want to make sure that you're on top of this stuff more so than ever while you're dealing with the aftermath of an attack like that.

[00:16:15] Amen to that. All right. Zions Bank emailed me a security thing that I found pretty interesting.

[00:16:22] And I thought I'd share it with you really quick. Jay, this is interesting. Zions Bank: "In our efforts to enhance security and security awareness amongst our financial partners,

[00:16:32] we wanted to shed light on a new threat." They call it a persistent cyber threat, Jay, known as smishing, S-M-I-S-H-I-N-G.

[00:16:47] Heard of it? Smishing? I have. Okay, most people haven't heard about it. I've heard about it, but it's kind of interesting, and they say understanding this threat and taking preventative measures is critical not only for individual accounts but especially for business accounts.

[00:17:03] Now it's cool that my bank is sending me this by the way because I'm glad they're like on it too. And I know about this. I'm just telling you that it's neat to see them really send this out to their consumers now because I'm just at the bank.

[00:17:15] I think I'm not anybody special. I'm just a guy. Well, so I got it just as a consumer. What is smishing? Then they highlight this and say it's a blend of the words SMS, if you will, and phishing,

[00:17:31] and smishing refers to fraudulent attempts to basically take sensitive information via text messages. They say these messages might be texts that appear to come from banks or other entities, or even make you think they're from your cell carrier or legitimate sources, because you can't know, Jay.

[00:17:51] That's right. Yeah, it's social engineering just to try to trick you over SMS, because you automatically think, especially if they spoof a number or whatever, that it's legitimate, and they can get you to click on links and/or divulge security codes, like two-factor stuff or whatever.

[00:18:09] Yeah, so it's basically a blend, using SMS, text, words like this, to kind of blend this together. And so they seek to get you to divulge information, usually financial related.

[00:18:24] How do you identify smishing, so to speak? Suspicious senders, one of the biggest ways. If it comes from an unfamiliar number, a shortened number, etc., look out.

[00:18:36] Okay, oftentimes these messages carry a sense of urgency: oh, you've got to do this or take action now somehow. And they oftentimes request personal information.

[00:18:50] Sometimes they'll do a preliminary one that's not too personal and get you to go along, and then it becomes this personal information. So the point that I'm getting at is, if you see this along the way, you kind of go, well, the first one had the urgency kind of thing, but it didn't have the other factors.

[00:19:04] Then you get another one that requests personal information and you go, wait, okay. All these things should be treated with suspicion, not in an everyone-is-the-enemy way, but in a way that says, look, I'm on the lookout all the time. You've got to be cautious; stay skeptical.

[00:19:19] In the internet world, you are driving down the streets of pretty unsavory places sometimes without realizing it; that's the point. And you can protect yourself from smishing. Is that the way they say it?
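
The red flags discussed above (suspicious or shortened sender numbers, urgency, links, requests for personal information) can be sketched as a simple checklist. This is purely illustrative: the `smishing_red_flags` function, its keyword list, and its patterns are invented for the example, not a real detection tool, and a production filter would need far more than keyword matching.

```python
import re

# Hypothetical urgency phrases commonly seen in smishing texts.
URGENCY = ("act now", "urgent", "immediately", "account locked", "verify now")

def smishing_red_flags(sender: str, body: str) -> list:
    """Return the red-flag categories a text message trips (illustrative only)."""
    flags = []
    text = body.lower()
    # Shortened sender numbers (e.g. 5-6 digit shortcodes) warrant extra caution.
    if re.fullmatch(r"\d{5,6}", sender):
        flags.append("shortcode sender")
    # Messages that push you to act right away.
    if any(phrase in text for phrase in URGENCY):
        flags.append("urgency")
    # Embedded links, especially shortened ones.
    if re.search(r"https?://\S+|bit\.ly|tinyurl", text):
        flags.append("link")
    # Requests for sensitive personal or security information.
    if re.search(r"ssn|social security|password|security code|pin\b", text):
        flags.append("requests personal info")
    return flags

msg = "URGENT: your account locked. Verify now at http://bit.ly/x or send your PIN."
print(smishing_red_flags("60123", msg))
```

A message tripping several categories at once, as in the example, is exactly the pattern the hosts describe: treat it with suspicion and verify through a channel you already trust.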

[00:19:30] Anyway, I'm just having fun. But this smishing thing, now I can't even say it right.
