#462 Framed by the Algorithm: Tim O’Hearn on the Dark Psychology of Social Media

What happens when the very platforms meant to connect us instead manipulate us?
In this eye-opening conversation, Tim O’Hearn — former black-hat growth engineer and author of Framed: A Villain’s Perspective on Social Media — joins Mehmet to expose the hidden mechanisms behind social media addiction, fake engagement, and the dark psychology driving today’s online behavior.
Whether you’re a startup founder, tech leader, marketer, or a concerned user, this episode will shift the way you think about algorithms, content, and influence.
Key Takeaways
• Why all social platforms eventually get gamed — and why they let it happen at first.
• How social proof, dopamine loops, and algorithmic feedback traps rewire our behavior.
• The blurry line between black-hat marketing and everyday online tactics.
• How social media evolved from community to manipulation machine.
• Realistic strategies to protect yourself and your mental health online.
What You Will Learn
✅ How black-hat tactics exposed deep weaknesses in major platforms
✅ Why being controversial now beats being authentic in the algorithm game
✅ How kids and adults alike are losing control to tech-designed addiction
✅ Why the future of the internet might feel “dead” unless something changes
✅ Practical tips for healthier digital habits
About Tim O’Hearn
Tim O'Hearn is a software engineer who created some of the peskiest and most effective bots ever unleashed on social media. Between 2017 and 2022, his agency gained millions of followers for its clients while generating hundreds of thousands of dollars in revenue.
His February 2025 debut, Framed: A Villain's Perspective on Social Media, was a #1 New Release in "Social Aspects of the Internet" on Amazon. Framed is Tim's confrontation with the Internet Age delivered by a video game cheater who outgrew gaming but never stopped breaking the rules.
Tim has spent his career working in quantitative trading and has freelanced as a sports journalist.
The book:
https://www.amazon.com/dp/B0DW2X8YSK
Tim’s Website:
https://www.tjohearn.com/links
Episode Highlights & Timestamps
00:00 — Intro: Welcome and overview of today’s discussion
01:00 — Tim O’Hearn’s journey: from black-hat engineer to bestselling author
04:00 — Building bots that beat the system: Early vulnerabilities of Instagram
08:00 — The psychology of fake followers, likes, and social proof
11:00 — How AI is changing (and worsening) manipulation dynamics
14:00 — Growth hacks vs black-hat: where’s the line today?
18:00 — Is the algorithm pushing creators toward extremes?
22:00 — How the algorithm manipulates our emotions and behavior
27:00 — Who’s really to blame: us, the platforms, or someone else?
32:00 — Are kids the biggest victims of addictive tech?
36:00 — The lost magic of early internet communities
40:00 — Dead Internet Theory and the rise of AI-generated everything
45:00 — If we could redesign social media from scratch: what Tim would change
48:00 — Final thoughts: How to regain control in a hyper-digital world
[00:00:00]
Mehmet: Hello and welcome back to a new episode of The CTO Show with Mehmet. Today I'm very pleased to have joining me, from New York City, Tim O'Hearn. Tim, the way I love to do it is I keep it to my guests to introduce themselves. So tell us a little bit more about you, [00:01:00] your journey, and what you're currently up to, and then we can start the discussion from there. It's going to be a great set of topics we're going to discuss with you today. So the floor is yours.
Tim: Thanks, Mehmet. I appreciate that. My name is Tim O'Hearn. I'm a software engineer who wrote the book Framed: A Villain's Perspective on Social Media, which has been a number one bestseller in Social Aspects of the Internet on Amazon Kindle. For me, writing this book was a really unique path in that I wrote it mostly while working full-time in the quantitative trading space. It covers my experience between 2017 and 2022 of running a black-hat social media growth company. The book addresses both my personal experience there and some of the larger technological and social aspects of breaking the rules on social media, and the push and pull between the [00:02:00] customers, the advertisers, and the platforms themselves.
Mehmet: Cool. It's going to be a really interesting discussion. So, the title of the book: Framed. I'm sure that with anything we do, whether we start a company or choose to work somewhere, there's a story behind it, right? So what made you really want to tell the story, and from the villain's side, I would say?
Tim: I had written a lot about what I had done on Instagram specifically and on some other social media sites. I had written these things as blog posts, as essays, as unreleased diary entries. And as I compiled more and more material, I started to think that the angle really wasn't to provide a social media guidebook. There are thousands of those out there and most of them are very [00:03:00] low quality. For what it's worth, I thought it was much more useful to put this along the same track, the same story arc, as something like Chaos Monkeys or Disrupted, or some of the other books that combine the sociological aspects with a light memoir, a light journey through the industry. Rather than saying, "I'm an expert, here's a top-10 list or checklist of things you should do," I realized that I was definitely more so the villain than I was the savior, the good guy, the white knight, or what have you. For me, we broke the rules and we made money. And this is unique in that a lot of the other social media books, or big-tech books in general, are usually written by somebody who is more a professional writer than a professional rule breaker, or even a power user of the platform. So I thought I could combine a decent [00:04:00] writing ability with the fact that I actually did it. I didn't need a research team helping me. I didn't need a ghostwriter helping me. I was able to put it all together myself, hopefully in a way that's viewed with some measure of authenticity.
Mehmet: Great. Now let's start to uncover this with you, Tim. I like the way you framed it, which also comes from the book's name: the dark side of social media. You've built bots that outsmarted some of the smartest platforms; you did a lot of stuff. So what kind of learnings can you share about the weaknesses of these networks? You mentioned Instagram, but I think you can share more about other platforms as well.
Tim: The main weakness is that, regardless of who is building the platform, the last thing someone is thinking about when they're creating that [00:05:00] minimum viable product, that beta version, that first release, is how do we stop spam? How do we stop bots? How do we stop rule breakers? All platforms really care about at the start is more users, and users who are stickier, meaning they engage more and stick around longer. As time goes on, people will find ways to use platforms for their own benefit, and sometimes this happens in ways that are highly disruptive. Spam works because, unlike sending junk mail, where every piece has a small cost to it, sending junk email doesn't really have a cost in the same way. So a spammer in the early aughts, in 2004, could have sent millions of emails at a cost that was quite trivial. And the idea was that even if most people ignored them, they kept sending them because they worked. Taking these practices to social [00:06:00] media ten years later, in 2014 and 2015, we were finding that the exact same things were true when Instagram launched. And remember, Instagram had a very small team up until it was acquired by Facebook. They weren't as well prepared to deal with some of the rule breakers. That's spam, and that's also things like link sharing, and it also gets into darker things like harassment, or certain topics that are much more organized-crime or crime-syndicate based. We were thankfully on the much lighter side of that. But our experience, and what we see today, is that these platforms prioritize so much over keeping things safe and keeping the user experience clean. They just want more users. So they might have a hundred engineers working on features and building out other things, and only two or three people working on holding back the floodgates of [00:07:00] users with bad intentions. We've definitely seen a reversal, even in the later 2010s, where there was so much pressure, not only from the user base but also from advertisers and from governments, for platforms to moderate content better and even establish what is appropriate and what is not. So within that, it's not about legal versus illegal. Oftentimes it's about terms of service and community standards, which are actually significantly harsher than just black and white, lawful or unlawful.
Mehmet: Right. Now, one of the things you also talk a lot about, and I think this is what sometimes pushes people toward these, let's call them black-hat tactics: the idea of having more followers, more likes, or some [00:08:00] kind of presence there. How much of what we see is real and how much is fake? Where do these black-hat tactics play a role in this? And are they still applicable today? Because now we're in the age of AI, and AI can supposedly monitor these things in a better way. So what's your take on this?
Tim: Social proof has been really important from the dawn of Web 2.0. From the very first sites, even prior to MySpace, value was being ascribed to follower count, and then to things such as views. There are a million different ways we can accumulate popularity on the internet, and as time has gone on, these metrics have become more and more prominent. Badging such as the blue check mark [00:09:00] has become much more prominent and common, and people are willing to do some crazy things to reach popularity in their eyes, which they hope provides social proof, which either boosts their candidacy as an influencer and the rates they can charge, or the success of their business. Looking at the environment today, things have changed quite a bit. From when I started writing my book until today, Instagram's user base has probably doubled; we're talking about billions of people now. So things are a bit different in that platforms have been forced to take much stricter enforcement actions. What's forced them is not only their own sanity in maintaining a good user experience, but also advertisers who say, we don't want to advertise on your platform if you have content that looks like this. And that's kind of a form of manipulation. You could even say [00:10:00] it's a form of propaganda, depending on who's making that demand, and the customers really never hear about it. In the same vein, you have governments who are now much more interested in privacy and in mental health, specifically of teenagers and children, who have also come in hoping to enforce some guidelines. So now you have this mishmash of different requirements about what should be happening on the platforms and what duty the platforms themselves have to enforce it. Overall, on Instagram there is much less objectively fake behavior on the platform, because today it is much harder to scale a bot operation on Instagram specifically. On other sites there's still some, but take LinkedIn: when's the last time you saw a clearly fake [00:11:00] bot account trying to run a fake job scam or something like that? I would say it's quite rare. Earlier, on other sites, you would see these phishing bots, fake product bots, even accounts selling gray-area pharmaceuticals; that was really common. So I think between terms of service getting stricter, those terms of service being enforced, and enforcement against actual bots, the internet is slightly better. But now we have other challenges, such as AI-generated content, which I barely touched in my book.
Mehmet: Okay, before coming back to the AI part: this is something I see a lot, and I'm not sure if it's part of the delusion people try to create, or whether it should be [00:12:00] considered black-hat marketing or something like that. One thing we see a lot, and I see it more on X and more recently on LinkedIn, is this approach of posting something, and I'm not debating whether the content, or let's say the lead magnet for people, is genuine or not, but you see people saying, "Hey, if you want to get this, like and retweet," or "like and follow." Can we consider this black-hat marketing? What's your take on this, Tim?
Tim: For me, black hat means breaking the terms of service, and if not breaking the terms of service, violating social contracts. In chapter nine of my book, which is called "Punishing Crime," I talk about the whole spectrum from black hat to illegal and everything in between, and the fact that we're [00:13:00] not calling it purely evil, because it's not. A lot of the time there isn't a victim, or in other cases it's just kind of disruptive. People would be surprised to know that there are tons of businesses that exist even today that are black hat, that do violate terms of service and do violate legal precedents around copyright and other things. A good example would be virtually any value-added service that integrates with LinkedIn and does LinkedIn scraping or LinkedIn lead generation. Anytime a service is commercializing scraped data like that, there is a court precedent that it's illegal. One company specifically, called hiQ, became very popular and ended up in a legal battle with LinkedIn, and that's what created the precedent. But the funny thing now is that it's an exhausting process for the in-house lawyers and external counsel as well, [00:14:00] so you just don't see all of these new startups being snuffed out, even though there are millions and millions of dollars in that industry. To answer your question more specifically: is it black hat when people are saying, "Hey, like and subscribe"? It really feels awkward. It feels like it's part of the game now, where rather than watching an ad in a YouTube video, you have this inserted clip where someone's saying, "Hey, please do this, please do that." This is more a product of how competitive it has become to ascend these algorithmic exposure engines. The initial exposure someone gets might no longer be their entire follower base or their entire subscriber base by default. For a content creator, or someone like you or me with a smaller audience: if I have 1,500 followers or connections on LinkedIn, there's no guarantee that those 1,500 people will all see the content. It [00:15:00] might only be shown to 150 of them first, and then, depending on engagement rate, it might be shown to a wider span of my audience. And then, if it's liked a lot, it might be shown to a more general audience. The issue is that because more people are aware of this, they know engagement can help them out, and they're doing a lot more to increase that engagement rate, to ascend through each gate and gain much more exposure.
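A minimal sketch of the tiered exposure mechanism Tim describes, where a post is first shown to a small slice of followers and only graduates to wider audiences if its engagement rate clears a threshold. The slice sizes and thresholds here are hypothetical, not anything a platform has published.

```python
# Hypothetical tiered-exposure model: slice sizes and thresholds are illustrative only.
def simulate_exposure(followers: int, engagement_rate: float,
                      initial_fraction: float = 0.1,
                      gate_threshold: float = 0.05) -> str:
    """Roughly mimic the gating Tim describes: a small test audience first,
    then the full follower base, then a general (non-follower) audience."""
    test_audience = int(followers * initial_fraction)   # e.g. 150 of 1,500 followers
    if engagement_rate < gate_threshold:
        return f"Shown to ~{test_audience} followers, then throttled."
    if engagement_rate < gate_threshold * 2:
        return f"Cleared the first gate: shown to most of the {followers} followers."
    return "Cleared both gates: pushed beyond followers to a general audience."

# Example: a post with a 4% engagement rate in the test slice stalls at the first gate,
# while a 12% engagement rate would be pushed out to a general audience.
print(simulate_exposure(followers=1500, engagement_rate=0.04))
print(simulate_exposure(followers=1500, engagement_rate=0.12))
```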
Mehmet: I'm not sure if it's a valid question to ask, Tim, but do you think the way the algorithms are designed, and I know each social media platform's algorithm works differently, is actually forcing people to go and try to hack the algorithm itself? I've been into this, and I'm not talking black hat in the true sense, but making sure, for example, that a post will be shown to more [00:16:00] people. We used to do it, and I'm talking about corporates here, and I think they still do it, where they say, "Hey, once we put this out, everyone like and comment so it appears more." But to me it's not authentic; it's a fake thing, and we're trying to hack the system. That's the reason I never liked it much. At the same time, I understand people who feel they're shadow banned. They say LinkedIn is not showing my content, X is not showing my content, and the only solution is to press the bell button next to the name. So are the algorithms of social media broken, in your opinion?
Tim: It's gotten to a point where it's created, or incentivized, bands [00:17:00] of influencers, especially in certain niches, that I would argue probably should not exist. What I mean is, we have people who understand how difficult it is to get exposure on the internet, and in 2024 and 2025 what we're seeing is people tending to appeal to extremes. We see this more in the general influencer space, like those making content for teenagers. We see it in musicians, where it seems like, why are the musicians suddenly doing all this super controversial stuff? Why are people I used to know and love and adore posting controversial content? It's because controversy, really splitting from the norm, is how you drive this organic virality. It's not exactly [00:18:00] how it was meant to go. When you went viral in 2009 or 2010, it was organic, and in most cases it didn't happen on day one. Those early YouTube videos were discovered later and then more organically promoted, shared from person to person up the chain of the influencers of the time; that's how we got a lot of our early viral videos. It wasn't rocketing to the top of the charts from day one, whereas now, as you said, that's the only way to really be successful. And whether you're going for a mega hit or just saying, "Hey, here's our new product," there's so much pressure, even on friends or close collaborators, to like the content, quote unquote, "for the algorithm." So it's created a very unfortunate state where small-time creators are, I believe, [00:19:00] compromising their values and creating content that is potentially corrupting and, in my opinion, offensive compared to the things I saw earlier on the internet. It's a bit more relevant in the entertainment space than in the more tech-based space that you or I exist in, but you see it there too: your LinkedIn feed is probably filled with rule breakers, or people doing the more gray-area stuff. Specifically, Interview Coder, the AI program that embeds in the computer and can solve interviews for you; that's been plastered all over my feed because it's controversial, but there were also negative social consequences to it. So we're finding that these algorithms have unintentionally created an entire generation of influencers who are thinking not about how to be the best, but about how to be the most controversial.
Mehmet: [00:20:00] Yeah. The other thing, and I think people will relate; I'm not against it, and I tell people that if I say something it's not because I want to criticize, I believe in freedom too. But one of the things that irritates me, Tim, is that as humans we try to break into the algorithm, we try to find a way around the algorithm. One of the things I started to see, and honestly I tried it myself because I wanted to see the results, is the selfie: the selfie drinking coffee, or looking at the sea or the ocean. I've seen people doing it and really asked myself, did they even look at the picture before they shared it? And of course there's this debate that starts to spark: LinkedIn is not Instagram; go to Instagram, that's the fun platform. [00:21:00] So the question here is: are we trying to break the algorithms? You just mentioned this, but I'm going a little bit psychological, even philosophical: how much is this related to us being too invested in impressions and likes, rather than, as a content creator in any field, just putting your thoughts out there and leaving it? What do you think about this, and how much is addiction to the feeds playing a role in it as well?
Tim: Social media feeds, as we call them, have certainly introduced this unpredictable wave of antisocial behavior, and that's the great irony of social media, of the psychological aspect of it. What I started to uncover as I ran my [00:22:00] business, and later as I wrote my book, was that there are so many things people do that just don't make any sense. Rather than users guiding the algorithm through content, and the algorithm learning from users, nowadays it often feels that the algorithm is guiding us. It's guiding us to places, down patterns of behavior that I believe are antisocial, things that are harmful. At the extreme, it leads people to do things that are harmful to themselves in real life. For example, there are influencers who decide, "I'm not going to drive a car fast to get more views; I'm going to jump over a moving car." And when you have millions of children or young adults watching these videos, there is probably some risk that a few of those kids are going to try to [00:23:00] jump over a car. This is captured in my book, in the chapter on influencers and verifying them, which talks about who the influencers are and what they're doing to us, not only to our behaviors but to our perceptions of what it means to be notable or successful. Many people these days equate success with followers or with metrics, but I make the argument that that's really not how it works in the real world. I know many successful people who don't really even use social media, or don't look at their follower count, or actually want to keep things private, just for friends. What I addressed in that chapter was that when I was growing up, we were watching Bam Margera and the Jackass brand of, essentially, stunt videos. And in my group of friends, if you're watching these videos, and these are the popular people, the cool guys everyone wants to be like, there is a [00:24:00] tendency to ride your skateboard without a helmet. I make that point: I had friends who broke bones because they rode a skateboard really fast down a hill without a helmet. By the same token, one of the earliest examples of harmful influencing, of these clusters of internet behavior, that I found actually had to do with homemade flamethrowers. There was this tendency around 2007 and 2008 where people were lighting Axe body spray on fire, and it turned out to be highly flammable. I included this in my book because I actually had a screenshot from a video I made as a freshman in high school, where me and my buddies went and lit body spray on fire. But there's a reason you shouldn't ignite aerosol: it could explode, it could burn the house down if you do it inside, it could cause life-altering injuries. So I make the [00:25:00] point that with influence comes bad influence. Unfortunately, the way the algorithms have grown up, and how the internet has grown up, there's always been this tendency; since Jackass, since before the internet, we've had these bad influences. But now it seems like the majority of influencers are bad influences, encouraging behavior that twenty years ago we would have looked at and said, why are the kids doing this?
Mehmet: Right. Now, I think the title Framed implies some blame, right? So if we want to assign blame for what's happening now, who should we blame: us, ourselves? Should we blame the platforms? Or should we blame the people behind the curtains who try to break things?
Tim: I think the cop-out answer is that we all have ourselves to blame to [00:26:00] some extent; nobody is forcing us to be on social media. But today that seems like a very weak explanation or justification. I could make the argument that you really can't be competitive in today's job market if you aren't on LinkedIn. As you suggested earlier, LinkedIn now looks more and more like a traditional social media feed, with vertical videos on a carousel, which I can't understand how they allowed. But the point is that social media is so intertwined with our lives that we do need to start thinking about this next wave of healthy usage habits, and about who is ultimately responsible. I think this is very difficult and controversial. We saw it in the US election cycles, especially a couple of years ago, where people were approaching companies like Meta and saying, well, there might have been interference, there might have been issues with how things were [00:27:00] done through traditional advertising channels and through advanced understanding of the algorithms, and the American public was essentially subjected to a propaganda campaign. We could look at these things and say, okay, sure, but who's deciding what's right and wrong there? Who is arbitrating between what's propaganda and what's not? And ultimately, what's the difference between a foreign actor doing it, a state actor doing it, an individual doing it, or a private company doing it? I appreciate the increased transparency there, but I think a lot of the blame has been misplaced. At the end of the day, propaganda or not, it's still up to the user to fact-check and reach their own conclusions. But if somebody spends all day on TikTok or Facebook hoping for that rage bait, hoping to become emotionally invested in something, of course they're going to find it. There's an entire industry of people looking to manipulate, looking to sell [00:28:00] products, and it creates this state where we really don't know who's to blame.
Mehmet: Tim, do you think the majority of us are still ignorant about this, or do we know it and just say, yeah, we know it, but who cares? What do you think?
Tim: Baseline, the level of addictiveness of these apps is, I feel, well known at this point. In the chapter "Screen Grabbing," which originally was 50 or 60 pages and could have been a standalone book, I make the case that parents at this point should really know that their kid shouldn't be scrolling, shouldn't be on TikTok. But then I'm on the train here in New York and I see it's the main entertainment device for children. So I make the case that [00:29:00] to give a kid a phone is to redefine childhood, and it is redefining these things. So yes, there's a lot there, and there's a lot of responsibility even at the local level; you could find it at the school level and then, of course, the government level. But at what point do we say there should be less freedom? If we were to say, okay, countrywide, everyone has to be off social media if they use it for more than six hours a day, in general we could say that's probably feasible; nobody should be there for more than six hours a day. But it starts to impinge on what we think of as freedom across most of the world, and it creates a lot of these very weird surveillance-state, 1984-like environments. We're not well prepared to deal with it, and we're clearly not well prepared to regulate it anyway.
Mehmet: Right. And I think, and this is again my own point of view, the problem over time is the quality of what's in there, because it might take two scrolls to find a bad video [00:30:00] or a bad post, or you might stay scrolling for hours seeing the traditional things, as they call them, the cat videos and so on. So the issue is how far this goes, and I'm not sure if we can call it black hat here, because your device knows a lot about you, so the social media providers know a lot about you. They're getting to the point where they know what to show you and when to show it to you, and then it's your choice. And I think this is where our weakness lies, all of us, me included. I try to hold myself back; if I [00:31:00] have to open something, I open one and only one, because I've seen people multitasking between Instagram, TikTok, LinkedIn, X. I've been there, and it's a horrible experience, really horrible. But is there a way to change? If I want to get straight to the point: is there a way to get out of this circle, Tim?
Tim: The only way, policy-wise, would start with how social media is regulated for young people. I no longer consider myself a young person, but I generally agree with the guideline that nobody under 13 years of age should be on [00:32:00] social media. But then the question is, what does it mean to be on social media? Does that just mean they can't create an account, but they can still scroll all day? At that point someone won't have awareness of the addictive nature, because, as I say in the closing of my book, most kids today have never known a world without pervasive technology. The point is that by the time they can start to identify, "I hate this, this isn't how things should be," it's many, many years later. That's where we think there's probably some level of consumer protection involved. Originally I drew the comparison to smoking: you wouldn't give a baby a cigarette, but we are giving babies cell phones so they're preoccupied, and I think that's really, really harmful. People who are older, people who grew up in a world where dial-up was still an internet option, were more aware of the effects and of what we could do. But policy-wise, it has to start with keeping kids off social media rather than going in and saying we need [00:33:00] algorithmic transparency or something. I make the case in my book that it would be insanely expensive to force a company like Meta to unravel their newsfeed and their recommendation engine and say, show me exactly why you're recommending this, or where the content is being sourced from. It would increase their costs by probably an order of magnitude, and in many cases it still wouldn't be explainable in plain English, especially for things involving deep learning: content with many embeddings that don't have clear English descriptions. So overall I find it very difficult. As far as what can be done for people like me or you, using screen-time-limiting apps is probably the best we can do: an app where you can configure how often you can access a site, maybe with a daily limit. I found that very effective, and I know most of my friends do as [00:34:00] well. The extreme is to simply stop using social media. I've been forced off some platforms, mainly because of past transgressions, but there's also nothing saying you can't just delete the app off your phone, or something along those lines.
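A minimal sketch of the kind of screen-time-limiting app Tim mentions: track today's usage per site and block access once a configured daily limit is hit. The limits, site names, and storage format here are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical daily screen-time limiter: limits and persistence are illustrative only.
import json
from datetime import date
from pathlib import Path

USAGE_FILE = Path("usage.json")                            # assumed local storage file
DAILY_LIMITS_MINUTES = {"instagram": 30, "linkedin": 45}   # assumed per-site limits

def load_usage() -> dict:
    """Load today's usage, resetting the counters when the day changes."""
    if USAGE_FILE.exists():
        data = json.loads(USAGE_FILE.read_text())
        if data.get("day") == date.today().isoformat():
            return data
    return {"day": date.today().isoformat(), "minutes": {}}

def record_and_check(site: str, minutes_spent: int) -> bool:
    """Add time to a site's counter and return True if the site is still allowed today."""
    data = load_usage()
    data["minutes"][site] = data["minutes"].get(site, 0) + minutes_spent
    USAGE_FILE.write_text(json.dumps(data))
    return data["minutes"][site] <= DAILY_LIMITS_MINUTES.get(site, 60)

# Example: after 40 minutes on Instagram against a 30-minute limit, access is blocked.
if not record_and_check("instagram", 40):
    print("Daily limit reached: blocking instagram for the rest of the day.")
```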
Mehmet: Absolutely. Yeah. But let's be realistic: for some people, I know it's hard, me included. For example, if you ask me, can you delete LinkedIn, because it's the main platform I use: no, I can't. I need to check the feed. It's not exactly addictive, but I need to at least open it two or three times a day and see what's in there. If you want to compare it to the times when only a few of these social media platforms existed: a lot of people think social media started with Facebook, but actually I was there, I'm old enough to remember [00:35:00] when something like Friendster came along, and MySpace. Do you think at that time those guys just didn't know how to build a hook and all these mechanisms, so we had to wait for the Facebooks and the Instagrams and the other platforms to come? The purpose at that time really was to have what's called the social connection. Do you think this is also what drove us in that direction and opened the doors for all kinds of exploitation of these social media platforms?
Tim: Certainly. When we think about how things evolved in early Web 2.0, Friendster to MySpace is probably the most crucial transition, thinking about [00:36:00] the behaviors it created or encouraged among early users, and the features, in the realm of what we now call persuasive technology. All of this came together and was then taken to the extreme on platforms like Instagram, and I think especially Instagram. I have a chapter in the book called "Why Instagram," and the reason I think it's the best platform for this analysis is that it took a lot of the addictive aspects of MySpace, and a lot of the harmful social behaviors of saying, "Hey, I want as many followers as possible," and then it gave it to you on your phone. So there's this tight coupling between more engineering hours being dedicated to keeping users engaged and sticky, which wasn't really the point of MySpace at all, it just happened by happenstance, and putting it on the cell phone. I can remember, for years and years of my [00:37:00] life, how much I looked forward to getting home from school so I could browse Reddit, or Facebook, or MySpace. If we still had these natural delineations in life, where you had to wait a few hours before logging in, I think the world would be a better place, but it would also be a much more disconnected world. The introduction of the cell phone, being able to get that dopamine with a thumb tap in your pocket, essentially ruined everything. So whether or not the engineers developed these systems as quickly as they did, that is now the main purpose on a lot of teams in big tech: to increase engagement rates. If you break it down to the human, biological level, we're talking about tricking the brain into producing more dopamine through these traditional reward pathways. And I will say, for me it's been ten years or more since I really looked [00:38:00] at something and thought, wow, I'm having a great experience using this. And it's because we're all tapped out, we're so burned out, because it's been years and years of essentially being the lab rats for the engineers behind persuasive technology systems.
Mehmet: Right. Now, to me, when I look sometimes at the content across all the platforms, as you said, they are converging, in my opinion, toward the same direction.

Tim: Right.

Mehmet: And comparing this, not necessarily to technology, but history shows that when you have this kind of convergence, people get bored and go try to find something new. That's why some people talk about the dead internet theory, right? After a while people will leave everything and, I don't know, the next thing will come along. Some [00:39:00] people say it's the metaverse and VR, I don't know. So how do you envision this, Tim?
Tim: "The convergence" was one of the original terms I used when writing some of these essays, and it's the thought that all social media, exactly as you said, starts to feel exactly the same. There used to be this battle of, which one are you on? For a brief time in 2008 it was, are you on MySpace or are you on Facebook? We know who won that one. Now maybe we see it in the X versus Bluesky debate, but because that's also grounded in politics and personal beliefs, it doesn't have a ton to do with the actual feature set of a platform. So when I thought about convergence, it was along those lines, in that everyone realizes short-form vertical video is probably the most compelling content we can get, and that's why LinkedIn [00:40:00] added it, even though it's not really what I would expect on that platform. It's there. As time goes on, we've seen these conspiracy theories like the dead internet theory, which asks: if we reach this convergence, what about the idea of singularity, of technological singularity? It's not just saying that all the platforms are the same, that it's just a different storefront, a different skin on the same internals. It's asking what is happening to the organic human element on the internet. And that's where AI has come back to supercharge this dead internet theory of many years ago. Back then the theory was: everything you see on the internet, there's no way to prove it's actually real; it's just a government psyop where the CIA acquired a company that's meant to manipulate people. And I'm sure [00:41:00] that's true to an extent, of course; every government, every spy agency, anyone dealing with intelligence, has very advanced technology behind the scenes that they're employing on their own citizens, for measures such as counter-terrorism. However, when we think about AI, and the fact that AI can supplant actual human activity, then we start thinking: not only might the platforms be the same, and the algorithms so convincing, but if we really have no way to know who's doing what, who's to say that the content, or the person on the other side of the screen, isn't computer generated? And that's a really concerning progression, where it sounds like science fiction but is actually becoming more and more true, especially on Reddit, especially on text-based sites.
Mehmet: Right. So I think what people miss, Tim, is [00:42:00] that it's called social media, right? Exactly what you described was done for ages by traditional media, whether newspapers, later radio, later TV, even in the cinema sometimes, in the movies. But that was one-way communication. With social media, and I think this is why the word "media" is there, because it is a media, a full-fledged media, the advantage, in a sense, for the platforms is that they know what you're doing. They know what you're clicking, they know what you're liking, they know when you stopped while scrolling because something attracted your attention. So now we have this real-time feedback, which opens the door for all of this. I'm not saying social media [00:43:00] will cease to exist, because, to your point, we'd be very disconnected, and I think we've gotten pretty used to it. But I think at some stage people will stop chasing the glories of impressions and likes, because at the end of the day, let's be very frank, Tim: you do it for one day, one week, one month, one year, and then after that, what? I've started to see signs of this. It's my own observation, not a proper study, but I decided to follow some accounts on different platforms and monitor them over time. And I see that these guys, although they got all the attention, stopped after a while because they weren't able to take it anywhere. Or maybe they're now shadow banned, I don't know, but sometimes I go and check the account and nothing is there. They just [00:44:00] popped up and then they were gone, which is the cycle of life, let's call it that. Now, if you could go back in time and fix one thing when it comes to building social media, what would you change so that we're not where we are today?
Tim: This is a huge topic, and a weighty one at that. The cool thing about my experience, in addition to having done the black-hat social media thing, is that I also worked at a social media startup that put out an app that was live in the App Store. It had several thousand users, and it was a niche social media app. So I was able to implement certain persuasive technology systems: push notifications, the newsfeed, deciding what content was there, who it was shown to and why, [00:45:00] and also things related to recommendations. I had this unique level of control, and that of course led to some fairly advanced thoughts about what could be different on social media and in the evolution of the social media space. If there was something I could go back and change, it's really hard to say there's been just one thing, even if I had this god-like superpower. I think today, knowing what I know, and knowing how expensive it was for Meta to stop a lot of the bot activity and a lot of the organic growth providers, for me it probably comes down to more advanced regulations on the use of social media by vulnerable people, by people whose minds are still developing. At some point, I think treating social media on a similar plane [00:46:00] as addictive chemical substances is probably the right way through this, where it's not just about health and the physical manifestation, but also about mental health, problematic behaviors, and the great waste of human capital that comes from years and years of use of these platforms. I'm not saying MySpace should have come with a warning label, but as we pretty quickly saw on MySpace, and as I recount in my book, we should never have had people like me, at 13 years old, even on MySpace. It just wasn't right, and I don't think we're ever going to figure it out for younger people. So if there's one place where I think there's maybe some growing consensus, it's protecting the young and the vulnerable in the craziness of the internet.
Mehmet: So maybe we need to spread more awareness, I would say.
Tim: Yeah, I think awareness is important, and it really starts at the family level. If the parents [00:47:00] don't care about their kid doing this or that, they're probably never going to care about the kid using social media or scrolling on the phone all day. So it's a losing battle, I think. But that's one of the major takeaways from my book, or I hope it is, for those who don't care about the business aspect, the true-crime aspect, or the technical aspect, but are thinking, one day maybe I'll be in a position where I can make a change. In a lot of cases, that means taking a child's or a teenager's phone, drilling a hole through it, and throwing it out the window, because I believe phones are actually that harmful and they facilitate activity that is quite harmful. In a way, I'm the villain, I'm the product of that, but I also know tons of people who have essentially lived lives that are quite degraded because of overuse of the phone.
Mehmet: Yeah. And I'm not sure if it's true or not; I never fact-checked it. It's [00:48:00] not related to phones exactly, but I think I read somewhere that Steve Jobs didn't want to allow his kids to have iPads. I'm not sure how true that is, I never fact-checked it, but I think he was able to see, maybe not the complete picture, but the addiction and the exposure you would have once you have this mobile device. It's a device you can have with you anywhere you want, from the moment you wake up to the moment you go to sleep; it's always with us, stuck with us, like part of our bodies, right? And not only for kids, by the way. I know you're focusing more on kids and teenagers, which is absolutely where we should be focusing, but I've seen how it affects adults too, in a very, very negative way. And I advise people, because there is this practice where you go away for a weekend and [00:49:00] don't bring any of your devices: no laptops, no mobile phones, no tablets, nothing. I tried it a couple of times, and guys, I can't tell you how beneficial it is for your mental health. So it's a good reminder for us as well. Tim, any final words you want to share with us, and where can people get in touch and find the book?
Tim: I definitely appreciate it, and I think we're on the same page as far as what can be done to better understand, and then be a better, more responsible user of, social media. Anyone who's interested in these topics and more can find them in my book, which is Framed: A Villain's Perspective on Social Media. It's now available in ebook form and also in paperback, shipping worldwide on Amazon. Anyone looking to find out more about me, and to follow me week over week [00:50:00] rather than waiting until I publish another book, can find me on my blog at tjohearn.com, or through my newsletter on Beehiiv.
Mehmet: Great. I will make sure to put all the links in the show notes so people don't have to go and search. Tim, I really enjoyed the conversation. It's a very important topic, especially for the new generation, and I hope the adults among us will also take care, because this is a sensitive subject, and knowing the secrets behind it, especially from someone who was on the other side and is now exposing all these things, is very important. It's not just for some people, it's for everyone, as I was saying, so I advise everyone to go and grab the book; I will do so myself. And again, I'll put the link to Amazon [00:51:00] in the show notes so they can go get it there. Tim, again, I can't thank you enough for sharing your insights, your experience, and of course the thoughts in the book. And this is how I usually end my podcast episodes; this is for the audience. Guys, if you just discovered us by luck, thank you for passing by. I hope the social media algorithm will push us to you and that you'll see the snippets I usually share. If you like it, please subscribe and share it with your friends and colleagues. And if you are one of the people who keep coming back again and again, thank you very much for all your support. Thank you for making us rank in the top 200 charts in multiple countries on Apple Podcasts this year. Thank you also for choosing us as a top 40 must-listen podcast in multiple countries. Just this morning, the 15th of [00:52:00] April, I learned that we were chosen among the top 100 future technology podcasts globally. Thank you all for the support; this cannot be done without you. And as I always say, stay tuned for a new episode very soon. Thank you. Bye-bye.