
Does this sound familiar? You tell your teenager to be careful on social media, but you're not seeing any improvement in their behavior or safety. It can be frustrating to feel like you're not making a difference in your child's online life, especially when there are real dangers lurking on social media. But simply telling your teen to be careful isn't enough. In fact, it can even backfire, leading to resentment and rebellion. Don't let the fear and confusion of social media use overwhelm you. With the right tools and guidance, you can help your teenager navigate the online world safely and responsibly.

In this episode, you will be able to:
Examine the challenges of social media consumption for minors.
Investigate the heated debate regarding TikTok and its potential hazards.
Understand the significance of privacy and personal choice for teens on social media.
Discuss the effectiveness of content policing on social networking platforms.
Find out ways through which teenagers can practice mindful social media usage.
Join Podpage: https://www.podpage.com/?via=radell
Join Podmatch: https://www.joinpodmatch.com/purplepoliticalbreakdown
My special guest is Chris MacKenzie.

Chris MacKenzie is an experienced communications director who has dedicated his career to advocating for a technologically progressive future. With a background in working for Democratic campaigns and lawmakers in red states and rural communities, Chris has a unique perspective on the potential impact of social media use among minors. As a key figure in the Chamber of Progress, he has extensive knowledge on related policies and the intersection of technology and progressive values. His insights are invaluable for parents aiming to ensure their teenagers' responsible use of social media.

The resources mentioned in this episode are:
Visit Chamber of Progress website (www.chamberofprogress.org) to learn more about their work promoting technology's progressive future.
Support The Purple Political Breakdown podcast by sharing and leaving a review.
Monitor your child's social media use and be involved in their life.
Use parental control settings to limit your child's access to certain apps or content.
Consider using apps like Bark or Qustodio to monitor your child's online activity.
Educate yourself and your child on digital safety and responsible social media use.
Encourage your child to report any inappropriate or concerning behavior they come across on social media.
Advocate for national legislation on social media use for teenagers and children to protect their privacy and well-being.
Contact your elected representatives to voice your support for such legislation.
Consider supporting organizations like Common Sense Media or the National Center for Missing and Exploited Children that work to promote online safety for children and teenagers.
This podcast uses the following third-party services for analysis:
Chartable - https://chartable.com/privacy
Podcorn - https://podcorn.com/privacy
00:00:00 Just this year, there have been reports of journalists having their data exposed in China. There have been reports of TikTok creators having their financial data sent to China. When Texas passes a law that would prohibit social media platforms from hosting teens on a site where they can see, quote, unquote, grooming content, what I see is an opportunity for Texas's Attorney General to sue major social media platforms to take down LGBTQ content. Pay attention to how they're spending their time on social media, and be involved in their life. 00:00:41 Do not let the Internet raise your kids. Welcome to the Purple Political Breakdown. I am glad you are here and I'm glad you are listening to today's podcast episode. My mission in each and every one of these episodes is to really focus on the solutions to some of the biggest questions and most controversial topics going on in our current society. I feel like most of these conversations are not truly being discussed in a more logical and respectful manner due to the political toxicity that goes on with both the left and the right, both the Democrats and the Republicans. In this podcast, 00:01:24 I don't care about any of that. I am focused on the solutions. I'm focused on bridging gaps. If you want to join me on this journey, if you want to discuss some of the most important topics, if you are tired of the political toxicity and negativity from both sides, please support this channel, share the podcast and go to my website, www.purplepoliticalbreakdown.com. I appreciate the support. 00:01:54 I'll continue to make content and hopefully we can start bridging these gaps and focusing on real issues going on in our world. Welcome back to The Purple Political Breakdown. I'm your host, Radell Lewis, and today we're doing episode number 25, as we're going to talk about: should we have national legislation in reference to social media use for teens, for kids, for toddlers, et cetera, et cetera.
I'm going to have that conversation today with my guest. 00:02:23 His name is Chris, and we're going to really dive into the nitty gritty on the harms of social media and whether this is even something we should do as a federal government. Before we dive into the topic of choice, of course, I'm going to start off by reading a review that has been dropped on my podcast. If you want to be shouted out for a future episode, leave a review so I can share your thoughts on a future podcast episode. Today 00:02:52 I'm reading a review from the Fallible Man. He said it was refreshing to find a fairly neutral and open political conversation, actual conversation, real discussion. Thanks for that. Appreciate you. 00:03:08 That's what we try to do here. That's what this platform is for. That's what these shows are for. That's what these podcast episodes are for. To have actual conversations, come up with actual solutions and kind of forget about all that political bias and toxicity and really try to dive into things a lot more, 00:03:26 I would say, logically, in my personal opinion. Because I feel like there's plenty of solutions we can dive into, but for some reason, because of this political toxicity and political bias, we never really are able to get to them properly. Right. Especially with the implementation of social media and how it really kind of warps society in a very unique way. And because of how people interact, well, one, the person-to-person connection kind of changes how people interact with other people, they kind of lose those social skills. 00:04:05 They kind of feed into the extreme and the negativity and the toxicity. But it seems my guest is here, so let's hop in. So I want to kind of start out by letting you introduce yourself, tell what you're about and how you're kind of involved with this type of discussion topic we're talking about today. Sure. My name is Chris MacKenzie.
00:04:31 I'm the communications director at Chamber of Progress. I've spent most of the last decade working for Democratic campaigns, actually, and lawmakers in red states and rural communities. So from Alabama to Oklahoma to western Maryland, all very purple areas. I've been at Chamber of Progress pretty much since our organization was founded two years ago. And Chamber of Progress is a center left tech industry policy coalition promoting technology's progressive future. 00:05:04 We work to ensure that all Americans benefit from technological leaps. Our partners include tech companies like Amazon, Apple, Meta, Google, and dozens of others, but our partners don't have a vote or a veto over our positions. Okay. Very interesting how you're kind of diving into the technological aspect and trying to show the importance it will have moving forward in terms of society as a whole. I'm curious because you did say center left when you're referencing it, because I never really thought about the political spectrum for advanced technology. 00:05:51 So is there like an opposite side where there is the same kind of process, but they do it for more center right or right leaning political agendas? Or is this mostly a thing you see for the center left or the left? First of all, I would say this is a sign that I've probably worked in DC too long, that I sometimes see policy from a partisan angle, and I can always see the left leaning perspective here. But I would say industry friendly groups are actually traditionally the arena of the political right. You have groups like the Chamber of Commerce that have for more than 100 years kind of represented the interests of conservative business. 00:06:39 Right. So we're a new industry coalition, and we're looking at how we can advocate for pro industry policies that also align with progressive values such as equality. Okay, I definitely get what you mean when it comes to usually when you reference conservatives or corporations.
You may think on the right in terms of industry, but it does seem like nowadays they're definitely leaning more on the left, and more corporations, especially with a lot of things that I look over, something with the ESG, something with George Soros or something, the CEI scores. So it's very interesting to see corporations more and more popping up that are more left leaning, because the tide is really shifting, where back in the day the right was like, oh, big business, big business. 00:07:39 But a lot of them are like, now we hate all big business, and now all of them are left leaning. So do you think that the tide will eventually shift back toward the right, or do you think it will just continue moving more and more to the left? Sure, I disagree with the premise. I actually think that most of big business probably still leans conservative. Right. 00:07:59 It's still very anti regulatory. They don't want to be regulated by the government. And Democrats are generally more pro regulation than Republicans are. In some areas, like ESG, there's some alignment between corporate interests and Democratic interests. And I think what's gotten Republicans angry about this is that, again, the private sector has traditionally been an area of support for Republicans. 00:08:29 So seeing the Chamber of Commerce endorse the first few Democrats during one of the last congressional election cycles made a lot of people really mad. Right. But I think we've seen kind of a breakdown in, again, that single party support from industry for Republicans, and now there's actually a little bit of a split. Right. Again, on some issues like ESG, there's some more alignment with Democratic organizations and politicians than with Republicans. 00:09:00 But I don't think it's one way or the other right now. Okay, fair enough, fair enough.
So before we dive into the topic at hand and really dive into social media, I always kind of go over some interesting news that is going on, especially relating to the topic at hand. And one thing that has been going on is in reference to TikTok. For example, there's been a lot of criticism of TikTok from a lot of mostly Republicans and people on the right. 00:09:31 Matter of fact, Montana wants to ban TikTok altogether, and TikTok, in response, doesn't like that. So they're trying to sue Montana, of course. So I'm curious on your thoughts, because you're, like you said, more center left. What do you think about Montana? Or what do you think about any states trying to ban a social media platform, even though technically it originated from a country that's not the United States? 00:09:55 Sure. Well, I think to answer your question really succinctly, I don't think it is within the authority of a state to ban TikTok, but the federal government, on the other hand, does have that authority. So to zoom out on your question a little bit, the idea of a TikTok ban, I think, is one of the sexiest issues in tech policy today. Right. It's really caught the attention of a lot of the news, and we have a really unsexy position on that, which is that it's all in the nuance. 00:10:32 Right. I think that TikTok poses a very credible national security risk. And our organization has taken a stance that Biden should conduct a thorough review through CFIUS, the Committee on Foreign Investment in the United States. And if that assessment reveals serious national security risks, they should force TikTok's parent company, ByteDance, to divest from TikTok. Right? 00:11:02 Of course, you can only force that divestment if you follow it up with a credible threat to actually ban TikTok if they don't divest. I feel like that was a very complex answer to your question, so let me know if there's a part of that that I can kind of reach in and clarify.
Well, one thing that I feel like people should understand, because you mentioned if it possesses a national security risk, that's a big part of people wanting or not wanting to ban TikTok. At the end of the day, what national security risk do you really think TikTok can have for the United States? This is a great question. 00:11:43 So I think there are two types of risks that TikTok, as it's currently owned with its Chinese parent company, poses to the United States. One is a privacy risk. It is not just that they have our personal data; all sorts of apps have our data. But it's that the Chinese Communist Party has access to that data through ByteDance, which, although it says it's storing data in the United States, there is a long and storied history there of that data actually making its way to China at the end of the day. The second risk that TikTok poses to us is one of content moderation. 00:12:26 Right. The TikTok app answers to its Chinese parent company, and its Chinese parent company answers to the Chinese Communist Party. And so at the end of the day, who's making those content moderation decisions about what posts get taken down, what posts get promoted? There, again, is a history there of TikTok taking down content related to, say, Uyghurs, Muslims from China who have faced a long history of discrimination in that country. So those are two big concerns, and TikTok has repeatedly claimed that those are not concerns. 00:13:07 But again, we've seen over the course of the last four or five years this resurface again and again and again. Just this year, there have been reports of journalists having their data exposed in China. There have been reports of TikTok creators having their financial data sent to China. So we hear this story resurface maybe every couple of weeks. And to me, I would describe this similar to, let's say you're hiking and you get a sprained ankle, right? You have all the symptoms of having a sprained ankle.
00:13:46 It's swollen, you can't walk on it, et cetera. And you know you need physical therapy for it. But in order to get physical therapy, you're going to have to go to a doctor and get that official diagnosis and get referred. Right. We have all the symptoms of TikTok presenting serious privacy and content moderation risks, but we need to go through the CFIUS process and take an official look at that, really get a diagnosis, before Biden can take executive action in a legal, constitutional way. 00:14:17 Because at the end of the day, there are some First Amendment concerns if you're banning an app from the United States. The First Amendment doesn't just apply to the speech that we say. It also applies to the speech that we hear, to the speech that we can read. Right. 00:14:34 So if you are going to abridge that right in any way, there needs to be a credible look at the national security threat that this app poses. Yeah, I definitely understand a lot of the things you're saying. 00:14:55 I think most citizens recognize, okay, yeah, we don't want another country to have this type of information. If it's the United States government, I'll be upset, but it's my own country. But then we know that there are a lot of people who really just don't like the United States even though they live here. So they're really not going to buy that excuse until something goes on. 00:15:12 Right. So it's just one of those unfortunate things. And that's why we need, like you said, we need to kind of dive into it more and more until we're actually officially sure. Yeah, this is definitely a problem. Let's take action. 00:15:25 Of course. And I think, well, this is actually going to go hand in hand with the conversation we're about to have, in reference to the First Amendment, in reference to the risk it may pose for not only teenagers, but everybody involved. Were you about to say something before we move on to the topic?
Yeah, I could talk about TikTok all day, but one of the points that I want to make here is that the need to look into whether this presents a very real and credible national security threat is one of the reasons that Montana's ban probably won't stand up in court. Right. 00:16:04 It is within the jurisdiction of the federal government to conduct that assessment about the threat that TikTok poses to national security. Montana hasn't done that. Montana is not doing that. Right. 00:16:18 We're watching the challenge closely, and I wouldn't be surprised to see that law struck down. Yeah, me neither, to be honest. All right, with that said, like I said earlier in the introduction, we're going to talk about social media and its usage in reference to, more specifically, teenagers and younger. So let's start it off with having you, Chris, kind of indicate what your position is and what you're talking about. Sure. 00:16:46 I mean, I'll start with an overview of what we see happening in the social media regulatory landscape right now. I would say there's a major battle going on in state legislatures across the country about teen access to social media. From a 20,000 foot view, our stance is pretty simple. We oppose legislation that would endanger the privacy of online users and isolate teens from needed online support. We want to make sure that young adults can connect with peers online who identify like they do and that teens have access to helpful information and resources on the Internet. 00:17:24 All right, and when you say teens, I think this is always important to clarify, because sometimes when people say teens, I'm like, okay, are you talking about a 13 year old or are you talking about an 18 year old? So are you referring to everybody below 20 or, like, everybody below 18, basically? Kind of. This is a really good question here. 00:17:45 Stop right there. Yes, this is a little mini ad. Don't skip, don't skip. All I want to tell you right now is that, at the end of the day, when it comes down to all the discussions I want to have, 00:17:54 I want to be able to communicate with you, the audience. I want to be able to relay a message and receive a message from everyone and try to come up with these great solutions that I keep on talking about. So if you want to be part of the community, make sure you go to the website and sign up for not only the email list, so you can get weekly emails from me for the podcast episodes, informational sessions, all that great stuff, but also sign up to go on my Discord so you could be part of the discussions 00:18:27 and debates on my live streams. So be sure to go to the website www.purplepoliticalbreakdown.com and go to the email list, sign up, and go to the Discord and join the server. Now back to the episode. Because a lot of the social media laws that we're seeing propagate in these state legislatures, and in some cases pass, don't distinguish between a 17 year old and a ten year old, right? Which is, there's a very big difference there in what those groups should be able to access online generally. 00:19:01 But when I say teens, I would say I mean 13 year olds to 17 year olds. So minors, legally minors, that we're referring to. That's most of who these laws would apply to, because largely children under the age of 13 are not allowed on social media platforms currently. They aren't allowed. They're not supposed to be allowed. They're certainly not supposed to be allowed on social media platforms, but these laws would apply additional restrictions to 13 to 17 year olds who find themselves on social media. 00:19:38 Okay, I like to do this exercise because I think this kind of puts things in perspective. I always do this for every person that comes on, especially when there's a topic of discussion that's being discussed. So before you start listing the laws you're referring to, can you lay out the pros and cons of social media usage for teenagers? Yeah, sure.
I think that I'll frame this up by saying that last week, the Surgeon General came out with a new report about some of the pros and cons of the use of social media for children and teens. 00:20:23 Right. One of the pros that we see is that most teens and most children who use social media report having a positive experience, report developing connections with friends and having an outlet for creativity and to express themselves. We also see higher benefits of using social media for some groups over others. Right. LGBTQ teens, for instance, report actually higher mental health benefits from using social media than just the general population of teens. 00:21:03 One of the reasons is because they're able to connect with people who identify like themselves online and find supportive communities who can help them through what they're going through. Right. There are, of course, some negatives to problematic social media use. Teens have reported, obviously, some mental health problems from too much social media use. And spending too much time online can also take teens away from their family and take teens away from engaging in real world activities. 00:21:38 So there are drawbacks as well. But on the whole, again, we see teens reporting a positive experience with social media. Okay. All right. With that said, what do you have in mind 00:21:54 referencing the laws on social media? Can you dive into that? Yeah. So we've seen a wave of state and federal legislation pop up this year related to teen use of social media. Right. 00:22:06 And I would say one of the first laws that we saw was in Utah. Utah proposed a parental control law that would require parents to sign off on their children creating a social media account in their state. And it would also require social media platforms to verify the age of any user using their site. Right. That presents two major problems. 00:22:35 Right. One is a privacy issue.
And this is an issue that we see again and again with any law that requires social media platforms or online platforms to apply different standards to young users than to old users. These laws require age verification, and to verify the age of a user and protect the company from liability, platforms need to collect personal identifying information. 00:23:05 Right. So you got to show your ID to get in, right? Yeah. That's one thing if you're walking into a convenience store and buying, let's say, an adult magazine or a lottery ticket, right. Because at those convenience stores, for the most part, the clerk will check it out, give it back to you, and that's it. 00:23:24 Your information is gone. If you are entering your personal identifying information at the doorstep of every website you go to, that creates a lot of privacy risks for you. You're exposing your personal information online in a lot of different places. Right. And so that's going to expose you to a lot of cyber risks, and that's not just for children. 00:23:48 Adults are exposed to the same risk. Because if you're logging into social media, the social media site is not going to know if you're a 14 year old or a 41 year old. So if I'm logging into social media, they're going to need an ID from me just as much as they're going to need an ID from a teen user. So everybody sees those privacy risks when online platforms are required to offer separate services and verify the age of young users. Right. 00:24:17 So that's problem number one. Problem number two is that when you create parental consent requirements, you assume that all teens have supportive parents that keep the best interests of their children in mind. Let's just take the state of Utah, which was, again, the first state to sort of pass this parental control legislation. In Utah, there's something like more than 10,000 cases of confirmed child abuse every year. Right.
00:24:50 Those are children who have a real need to access online communities and find support online, who are then cut off from that access by a parent who is abusive or doesn't keep their best interests in mind. Right. But it's not just teens who live in abusive households. LGBTQ teens are another fantastic example. Right. 00:25:15 Again, this is a community that actually sees more benefits from social media than even your average teen. But a lot of LGBTQ teens don't live in a household with a parent who recognizes their gender identity or who supports how they identify. Right. So how will they create an online social media profile and connect with peers who identify like themselves, or connect with even online information to learn about what they're going through? Right. 00:25:48 That's going to become harder. Basically, these laws isolate teens by giving parents the keys to a teenager's social media account. And again, they don't treat teens who are, say, 17 years old any differently than they treat a child who's ten years old. Right. And we all know that a 17 year old requires a little bit more independence and a little bit more autonomy than, say, a ten year old. 00:26:16 Utah's social media law actually goes so far as to require parents to have access to a teen's private messages, to their DMs. Right. In some cases, especially for a very young user, that makes sense as being something that would be important. But there's a level of privacy that teens also expect today when they're communicating online or when they're communicating in person. Right. 00:26:40 And especially for some teens, again, who don't have supportive parents, there's a real need for that privacy. Okay, so I want to kind of dive into this a little bit, because inherently, I don't know where I stand in terms of parents and their autonomy over their own kids when it comes to social media as of yet. But I'm kind of thinking it through.
So the first thing that I want to kind of play devil's advocate about is you referenced the identification thing. I do think providing identification will be tough in the first place for kids specifically. Until you're, like, 16, you're not going to have a driver's license. 15, 14, 13. 00:27:23 Most teens do not have, like, a state ID. I don't really know any teens that have that in the first place. So unless you're putting, like, I don't know, your birth certificate online, then I don't know how they're going to identify that. So that definitely will be a very tough thing to identify in terms of putting documented identification online. But with that said, do you believe that this documented identification they put online, in terms of the risk factor, is any more risky than any other personal information you put online in the first place? 00:27:58 Like when you go on social media and you kind of talk about who you are and everything, especially when it comes down to you're not using the proper kind of two factor authentication and you have a very easy password. Like, I'm just going to use "password" as my password because I'm 15 and I don't know better. Do you really think that kind of next step is going to really change that much in reference to the risk factor that they're going to go through in the first place? That's a really good question. And I would say, first of all, yes, there is a big difference in the information that I choose to upload when I'm creating a social media account. 00:28:44 I'm not putting my driver's license number online. I'm not putting my Social Security number on Facebook, right? I have my first and my last name and the city I live in. But beyond that, I'm not putting my contact information in there oftentimes, right?
And so there is certainly more personal identifying information that social media and online platforms could be required to collect in order to legally verify that you are above a certain age so that they can avoid being sued by the state, right. 00:29:18 They're going to have to go beyond collecting your email address and the city you live in. And the second thing I would say is that, yes, some users might choose to upload more information about themselves, right, more pictures with their friends or something that, say, some sort of a hacker could use to identify information about them from a picture and commit identity theft or something like that, right? But there is no choice that users are being given to avoid uploading that sort of information with these kinds of age verification laws in place. You want to access email, show your personal identifying information. You want to access Facebook, you want to access Twitter, you want to access YouTube, any of these major online social media platforms, you have to show your ID. And that's not a choice that a user can make, right? 00:30:16 Whereas I choose to upload a certain photo to Facebook, I don't have that choice if there's an age verification law in place about showing my ID. Okay. Yeah. Overall, when I'm thinking through it, I will say that when it comes down to it, the expectation for a teenager to put their ID or birth certificate online, and then an adult, I'll never do that. 00:30:41 It's not just a teenager. It would be adults that would be required to show their ID as well. Right, because before you show your ID, how's the social media platform to know if you're a teen, who they're supposed to be IDing, or if you're like us? Right, so it's everyone. It's everyone who these privacy risks would apply to.
00:31:00 Right, well, I think when you approach it that way, especially since adults are the ones that are going to make the decisions, not these teenagers in the first place, I think most of them would not be in favor of putting their personal information online just to sign up for a social media account. I do want to clarify something when you're referring to social media, because there's a lot of things that fall under the moniker of social media. Right. YouTube falls under the moniker of social media. So are we referring to mostly, like, the more communicative social media platforms like a Twitter or Facebook or Instagram, for 00:31:34 example, or are we talking about everything? It depends on which law you're talking about. Oftentimes these laws have a requirement of a certain amount of monthly active users for any platform that they apply to. In some cases, they explicitly exclude email. 00:31:56 Each law has its own set of requirements about what platforms it applies to. Right, but one of the issues, as we see these laws pop up in a number of different states, is that it is very difficult for online platforms to assess with certainty which state a user is coming from. So sometimes when a state enacts internet restrictions that require, say, showing identification, or eliminate my ability to see a certain type of content online regardless of whether I live in Utah or not, that restriction can end up applying to me downstream. That's one of the issues that we see: when you log on to a website, you get this pop up that asks you whether you want to accept cookies or deny cookies. 00:32:51 Some of those restrictions were first actually enacted in the European Union, but because of the way that the Internet works without borders, we end up seeing those restrictions pass in one place and then they end up applying to the entire country as a whole. Right.
Because rather than trying to discern which state I'm coming from, these internet platforms will just apply them to any US user. Okay, understood. 00:33:20 Understood. The second part that you mentioned, I think it was referencing parents' access to the social media accounts of the teenagers themselves. Now, it isolates teens from accessing content and connections that can be important to them. Right. 00:33:39 So my thing about that is a little bit different, I would say, because when it comes down to it, I mean, obviously when you're referencing abusive households, the social media platform, basically anything that is not dealing with the home, is an escape from what you're going through in reality. But when you're thinking about the job of the parents, the responsibility of the parents, one big thing that I am always an advocate for, especially when it comes to social media, because I personally believe that my generation and the generations around me are relatively the ones that kind of grew up where social media really wasn't a thing, and then we're growing up, and now social media really is a thing. We kind of understand the nuances of non social media versus social media. 00:34:27 So my big thing that I always try to tell people is, now you're the parent, you understand how harmful social media can be. So you need to start parenting and moderating that: my four year old probably shouldn't be on the tablet all the time; my twelve year old probably should actually interact with people, not interact with the Internet all the time. So my thing in terms of parental responsibility is that they should be very much involved with what their kids are doing, what their teenagers are doing, kind of similar to having the location automatically inputted on your phone, knowing where your kid is at all times just in case something crazy happens. Right.
00:35:06 So in reference to that, in terms of individual parental responsibility, how do you kind of argue or explain the other side to a parent who says, I want to be involved in what my kid is doing, I want to make sure they're not doing anything crazy, they don't know better anyway, they're a child? Absolutely. 00:35:27 I'm in full support of parents who want to monitor how much their kids are using devices and using social media services. And there are a lot of safety controls that today's larger platforms like Facebook actually do offer to parents whose children are on social media, so they can be notified when their child goes over a certain time limit or when their child is interacting with, say, risky people online. Right. So some of those controls do exist on these larger social media platforms already. But eliminating the option for privacy, especially for older teens, gets into an area where you really are isolating some kids from, again, peers and information and resources that it's helpful for them to interact with. 00:36:23 And while I agree that it could be somewhat helpful for, let's say, 70% of parents to have access to even greater control over their kids' social media, there's 30% of households where this could be actively harmful. Right. And what we're speaking out about isn't the average everyday parent who says, well, I want to monitor my kid's social media use. That makes sense. And listen, I'm not a parent, but I do recognize that that's got to be really difficult in today's day and age. 00:36:59 I'm sure it has to be. At the same time, we do have to look out for laws that could harm marginalized teens: teens living in a household without a parent that looks out for them, with an abusive parent, or with a parent who doesn't recognize their gender identity, right. So Utah's law, again, was one of the first laws to pass that provided these parental controls to households.
Texas has followed that up with their own law that also provides a form of parental control. 00:37:37 Right. It says that for any known minor who's on social media, a social media platform has to provide their parent with basically full control over their account, so a parent could shut the account down or deactivate the account. So rather than it being a sign-off to create the account, a parent always has to be in control of a minor's account under this new Texas law. But the Texas law also goes farther than just providing parental control, and it actually creates a banned list of content for teen users. Right. 00:38:19 And some of that content does make some sense. Right. We don't want teen users seeing, say, self-harm content on the Internet. We don't want teen users seeing eating disorder content on the Internet. But the banned list also includes some questionable stuff. 00:38:37 Right. It includes a reference to, quote, unquote, grooming content. Right. Okay. I'm not sure how familiar you are with the book bans that are passing in conservative states across the country, specifically targeting LGBTQ literature, right. 00:38:57 And the librarians who've been prosecuted and persecuted under these laws. Right. They're often persecuted under the guise of, quote, unquote, grooming young kids when they're, in fact, librarians offering teens access to LGBTQ resources. Right. Okay. 00:39:18 So I see some of these new social media laws, parental control laws, as almost an extension of some of these book bans in conservative states, because they serve the same purpose. They provide conservative parents, especially, more control as our country works to recognize LGBTQ people with more equal rights. Right. It provides them with tools in what they view as a culture war, and what I view at times as a war, often, on LGBTQ people. Right.
00:40:01 So when Texas passes a law that would prohibit social media platforms from hosting teens on a site where they can see, quote, unquote, grooming content, what I see is an opportunity for Texas's Attorney General to sue major social media platforms to take down LGBTQ content that teens might be able to see. And there's no reason to believe that won't happen when we already see conservative lawmakers suing to take books out of libraries that provide resources to these kids. Right. So I see it as all kind of part of the same movement in some sense. Yeah, I definitely understand what you're saying. 00:40:44 Before I tackle those two prongs, I have a question. So my question is: how is a state like Texas going to enforce these bans on specific social media content in a specific state? How do they enforce that in specific states? I'm not sure. What do you mean, in specific states? 00:41:11 Because obviously this ban, in reference to Texas, is specifically Texas, but the Internet is not specifically Texas. Exactly. So the moment I cross border lines, I'm like, okay, does my social media just change automatically to now give me access to the content? I'm curious, how do they even enforce this specific ban? 00:41:33 Well, the way they enforce it is through the Attorney General. There's no individual right of action, so it wouldn't be parents suing under this law, but I believe the Attorney General would sue a social media platform if a Texas teen is able to see content on their banned list. Right. And for social media platforms, which can have trouble distinguishing between a Texas user and a Utah user, it's very difficult serving them a different set of content when you cross the state line, as you said. The thing that they would probably be forced to do to avoid liability entirely is to take down any content that could make them liable.
So if you are a lawyer at a social media platform, the thing you do to avoid getting sued by the Texas AG is you self-censor and you remove LGBTQ content. That's probably what they would be forced to do to avoid being sued under the law. 00:42:37 Right. Which would, again, be extremely harmful, I think, to teens. Okay. Yeah. So that's what I assumed. 00:42:43 And one thing I want to tackle first, before I tackle the other part, is the content moderation thing. I will never agree, especially when it comes to social media, with the government, state or federal, starting to implement, this is the content you're allowed to see, and this is the content you're not allowed to see. Because once you start toeing that line, that's when it gets into problematic territory. Because they're making decisions based on what content they deem acceptable, obviously. I feel like mostly, in terms of society, we kind of know certain lines that we shouldn't cross. Like, we're not going to put on TV someone brutally getting killed and all this crazy stuff. 00:43:30 We kind of know these lines, but when they're blurry, like an LGBT situation where it's obviously gray, but obviously conservatives or Republicans may say, no, this is like Satan's work or whatever. I mean, to me, that's not blurry at all. To me, it's a First Amendment right to post LGBTQ content and information online. Right. Right. 00:43:52 I'm saying it's blurry in the sense that society as a whole doesn't have a collective agreement on it. But I think society as a whole has an agreement on some torture thing happening online, right. Or self-harm content. Yeah, self-harm, like you said. I think society as a whole knows, hey, we probably shouldn't be showing people self-harm. This is an interesting point, and if I can jump in here, I think a lot of people, when they think about content moderation, they really think that these decisions are black and white and that a lot of the content that social media sites encounter is black and white.
00:44:31 Like, well, it's someone murdering someone else, we've got to take it down. Right. There's a lot more gray area than I think your average user necessarily recognizes. And I'll give you an example, right. 00:44:43 Congress passed a law several years ago called SESTA-FOSTA that cracked down on trafficking content online. Right. So sex trafficking content. That seems like it could be helpful, right? Because you don't want people being trafficked against their will on the Internet. Unfortunately, what that did was force Internet platforms to overcensor for sex trafficking content. 00:45:15 And it took some people who are in the profession, practicing sex trafficking of their own volition on the Internet safely, and it forced them out into the streets, right. Because they were no longer able to access Internet platforms. So it had repercussions far beyond what Congress envisioned. And again, effectively, what you do when you create a banned list, when you ban a specific type of content online, is that Internet platforms will almost always overcensor to avoid getting sued. Right. 00:45:50 They're going to take more content down than they necessarily have to because they don't want to face the lawsuit at the end of the day. Okay, what do you mean by sex trafficking? I did not understand what you meant when you said that and then said they were put on the streets. What did you mean by that? 00:46:08 Sorry, I said sex trafficking. I did not mean sex trafficking. I meant sex workers. Sex workers. Okay. 00:46:15 That's what I assumed, sex workers. I just had to clarify that. Okay. So, yeah, I understand. So basically you're indicating sex trafficking is obviously bad, they over-penalized, and then sex workers who did it safely were also penalized because of it. 00:46:34 Yeah, I understand that.
I definitely think that's an issue, but I will say that's something that probably would have to be worked out afterwards, where these sex workers would have to make their case, the distinction of, what I'm doing is not the same as what's going on here. I'm just talking about the pros and cons for this. Pro: more of this disgusting content comes off. 00:47:04 Con: this happens, but then we have to work on that, take the next step, and try to find a solution for the people who are affected by what happens next. Right. So I definitely think that's one of the things, because obviously these companies don't want to get sued, and that's something you will never be able to control, because these companies will first and foremost look after themselves. They're going to take care of their own self-interest before anything else. 00:47:28 At the end of the day, they're a company. Exactly. So in reference to the content moderation point that I was making, it's always going to be iffy, because the moment you give them the power, the moment you kind of give them the inch, they may take that mile, with these states going, like, okay, now I don't like this, I'm going to try to get this banned. Maybe something will backfire for Texas. 00:47:56 Texas wants to be all conservative and go, like, okay, all this LGBTQ stuff off. Then we have a very lib state like California go, like, okay, I want all this conservative content off the platform. So it really doesn't help your side at the end of the day, because you're setting a precedent that'll be bad for all states moving forward. So this is one of those things where, unless you have unanimous agreement, like the self-harm stuff, we shouldn't be doing something like that. 00:48:23 I do think it's different from book banning, though, because a book is a physical object, whereas social media is so nuanced, it's a lot more difficult to properly enforce. Sure.
I think it's a similar set of activists and lawmakers who are advocating to kind of remove teen access to this material in school libraries or on the Internet. Right. It's some of the same information that they want to remove access to. 00:48:53 Right. I think the premise is the same. What I mean by what is different is that, like you said, if it's the Internet, then they'll overdo it and start banning it everywhere, whereas with books, you can just physically get the books and get rid of them. So it's way harder to control that for social media. 00:49:11 That is true. Although book bans present their own problems. Certainly. Yes, I agree with that. One thing you said really struck a chord with me, which is that both Democratic and Republican lawmakers can go overboard if they're given the legal tools to require online platforms to take content down. 00:49:36 Right. One great example of this is a bill proposed in Congress, I think a couple of years ago, that would require online platforms to take down health misinformation. Right. What is health misinformation? At the time, it was proposed by a Democratic lawmaker. 00:49:56 So it would have probably applied, under a Democratic administration, under the Biden administration, to COVID-19 misinformation. Right. But let's say the tables turn. Let's say that law is in place and a conservative president comes in the next year. How do they apply a law that gives them the legal tools to go after online platforms hosting, quote, unquote, health misinformation? 00:50:22 They could apply that to, say, reproductive health care information, something that I don't view as misinformation, but something that they might view as misinformation, right? So then again, you're in the territory of taking down speech, giving these politicians tools to over-remove. Do you want a great website like this?
This is my podcast website, where I direct the audience to come to watch the content, listen to the content, read the blogs, and much, much more. 00:50:52 If you want to have your own customizable podcast website, then use my affiliate link in my description to sign up for something called Podpage, and they can help you customize an easy podcast website for your personal podcast. Sign up to get a discount now. Again, use the link in my description to join Podpage now. Yeah, that's a great example, in reference to the COVID thing and the whole spectacle that came about, especially when Elon took over and put out the Twitter Files for everybody to read, and then people interpreted it so many different ways. 00:51:32 I read it, and I was like, okay, I feel like people are overthinking what is going on here, but it is what it is. Go ahead. I'm glad you brought up Elon, because I actually think that some information that came out this week sheds a little light on how social media platforms are being pushed to improve content moderation to better serve users, right? And that is the fact that Twitter has suffered something like a 59% drop in ad revenue since Elon Musk took over the platform, right? And that drop in ad revenue is directly related to a massive drop in users on the platform who these companies want to advertise to, right? 00:52:24 So in order for a company to gain users and grow their user base and grow their ad revenue, they are incentivized to come up with a content moderation strategy that their users mostly like, right? People don't want to see the stuff that we can agree is terrible, right? They don't want to see self-harm content. They don't want to see anti-Semitism and racism and violence when they log in to connect with friends on social media, right? So a platform that does a bad job of moderating for those things will probably see a drop in users and a drop in ad revenue. 00:53:03 And that's exactly what we're seeing at Twitter, right?
So it didn't necessarily require a new law passed by Congress to penalize Twitter for decreasing content moderation, right? It just required a different content moderation strategy, and the users will find a new social media platform. And what we're seeing today is actually kind of a proliferation of new social media platforms, right? A lot of people from our generation aren't using Facebook as much anymore. 00:53:31 Maybe they've gone to a different social media platform. I mean, there are a host of conservative social media platforms. There's Parler and Truth Social, and now I would even almost classify Twitter as one, right? And then there's Mastodon, that's a new social media platform. Bluesky. 00:53:48 I'm not sure if you've heard of these, but they're all new platforms. Some of them are also decentralized. They create different instances with admins that can moderate different parts of the platform with slightly different rules, so users can kind of pick their own flavor of content moderation within a social media platform. And so the private sector is answering a little bit of this question of, how do we create an Internet where people like and support the content moderation policies that are being implemented on the platform they're on? 00:54:29 This is a very interesting conversation, I think, when it comes to Twitter specifically: the conversation of whether or not Twitter should be, I forget what they call it, a public square. There you go. 00:54:49 The conversation of Twitter, whether or not it should be a public square. My thing is, when it came to Elon, my biggest issue, or the reason why I think Twitter is having an issue, is mostly because of Elon Musk and his decision to choose a side.
I think when you're the owner, when you're the face of the platform that's supposed to be the public square, and it's very obvious that you lean conservative, red, and Republican, you're obviously going to deter all the people from the left, all the people that are blue. You're going to deter them because they're going to think, oh yeah, this guy's a racist, or whatever. Whether or not they're right or wrong doesn't really matter. 00:55:31 Their interpretation will be that, and then they'll choose not to use your platform. Even if people had issues with how Twitter used to be, it never kind of announced itself as one way or the other. You may have your interpretation or whatnot, but it's way better to keep it ambiguous what the platform kind of represents instead of leaning to a side. That's what I think is the biggest fall of Twitter. Even though, I don't know, I think Twitter has more issues that are related to the people who use Twitter. 00:56:05 But in terms of the ad revenue, I think it's mostly because Elon decided to go on this escapade, even though it was not smart. It's not even smart for his own company, Tesla. He's not doing a good job in terms of appealing to the people that he wants to appeal to. Well, he's running a space company, too. You're supposed to be ambiguous in that way. 00:56:26 You don't have to have these issues. But Elon Musk is always on Twitter having these political debates for no reason. I don't know why he's doing that. I don't think you have to be ambiguous. Right. 00:56:36 But you have to recognize the sacrifices that come with not being ambiguous. If you want to be a public square, you can be, on the internet, right? But people are going to leave if they don't like what they see in that public square, and they're going to find somewhere else to spend their time online, right?
So I think as the Internet continues to proliferate, as we continue to see social media platforms proliferate, we're going to see more different content moderation strategies, and people will be able to find a platform that suits their flavor of content moderation, where they don't have to see the, say, far-right anti-LGBTQ content that they don't want to see. And I believe that's actually most users, right? 00:57:26 So I think that if you want to attract users, if you want to have a really big platform, what you probably want to do is have a fair amount of content moderation to kind of tamp down the extremism. There's a theory that's passed around the Internet that a social media platform can be similar to a bar in some ways, right? In that once you let a few Nazis in the door, you become known as the Nazi bar. All you have to have is a couple of Nazis that come there regularly to be known as the Nazi bar. 00:58:03 Right? Very true. And nobody wants to go to the Nazi bar, is the thing. Right. So you need that kind of thoughtful, powerful content moderation apparatus to build and grow one of the larger user bases online. 00:58:24 So what I will say in reference to that: I do think content moderation should be very thoughtful, and it should be very appropriate in terms of what they decide to take off or ban or whatever. My thing is, I agree with the sentiment that people want to see what they want to see. Obviously, that's why now they want to see their friends, they want to connect with their peers. Yeah. 00:58:48 They want to see people that are like them. But I think the biggest value of social media was the aspect of meeting people that were not like you. I think as we keep going in this direction, and this could go hand in hand with why America seems to be becoming more divisive as a country, it's because it's now easier and easier to go more and more into your niche ideology.
And my thing is, it's basically doing the exact same thing as before the Internet, where you only knew the people around you, only knew your community. Now it's easier to find people that are exactly like you in the ideology that you kind of emanate. 00:59:39 But my thing is that the value of social media at the end of the day shouldn't be, okay, I want to find all the people that only care about what I think. It should be opening your perspective, opening up your experience, and bringing in all of that to have a more wide-encompassing point of view and thought process. I think the biggest reason why we have a lot of issues in terms of conflict, not violent conflict compared to other places, but in terms of social conflict, is because people are so into their own way of thinking that they're not even willing to entertain the other side. So I agree that that's what people want, but I don't think that's what people should have, or what people need, in terms of how people interact. 01:00:30 That's my personal opinion. Yeah, no, I mean, I agree with you that it's helpful to not have everybody off in their own bubble. Right. But I also think that to engage civilly on the Internet, there actually needs to be a lot of content moderation, because sometimes, as we see on Twitter, people resort to just personal attacks, bigotry, and hate speech when they get in a disagreement on Twitter. And you can't have a productive dialogue when that's the content that proliferates. 01:01:04 People are very nasty on Twitter. I definitely agree with that. It's kind of weird how companies are going in such drastic directions, because, for example, you've got Twitch. It was one way, obviously leaned a very specific way, and people were upset. So what happened? 01:01:17 They made Kick and they made Rumble. What happened when they made Rumble? Rumble was like, okay, I want to be so different from Twitch. What happened when you were so different?
Okay, I'm just going to drop a bunch of Nazi stuff and N-words and all that stuff. 01:01:31 I'm like, do people just not know middle ground? Do you not know the moderation of, okay, I don't want to go too far and really just lean this one way or that way? It's just like, it's either all the way left or all the way right, and no in between. And that's what social media is. It does sometimes feel like there's a disappearing middle ground. 01:01:55 I hear you. Yes, I definitely think so. Another aspect that you kind of mentioned was in reference to parents having access to their kids' social media and whether or not they would shut it down. Why is that necessarily a bad thing? For me, I don't see the difference between that and them just taking away their kid's phone or turning off the Internet. 01:02:20 I don't really see the difference between that and what they did before. Yeah, okay. I think that there is a level of privacy that comes with, say, a young adult, say, a 17-year-old, creating a social media account, especially a marginalized teen, and what they're able to say to their peers online and who they're able to connect with and what their profile looks like. Right. There's a difference between a parent, say, monitoring how much time a teen is spending on their phone and taking the phone away, and a parent monitoring what the profile the teen just created looks like. 01:03:03 What do the pronouns that they're using look like? What do the messages that they're sending to their friends look like? Are they accessing LGBTQ resources online? Right. Again, we see harms in the future for marginalized teens and teens from LGBTQ communities, teens without the support of parents, 01:03:24 if you provide these parents with more controls to access their teens' social media accounts. So while these tools could be helpful for some parents and their children, they could definitely be harmful to other teens. I definitely see where you're coming from.
I'm just thinking about whether we'll really see a situation, especially in reference to what they already can do, where giving them more control over the social media will be any different from what they could do in terms of taking the phone away or turning off the Internet. Unless this teen is really just living how the parents want them to, and then only on the Internet they're living the LGBT kind of lifestyle where the parents have absolutely no idea, then I could see that. 01:04:24 But I feel like once the parent finds any type of distinction between how you're acting versus my values, because we all know that there are a lot of crazy, really strict, far-right people or very strict religious people who have a way of life, I don't see the teen being able to hide that up until they're 17. Eventually this parent is going to find out and they're going to act accordingly. So I don't really see it. 01:04:54 Well, first of all, a lot of teens don't even start exploring their identity until they're 17. Right. And then I would also say that most research shows that a lot of LGBTQ teens first start their journey of exploring their identity online and not in the real world. So that often is their first interaction with a lot of these communities and with connecting with peers who identify like they do. Right. 01:05:26 And it especially applies, again, to every social media law. All these different teen digital social media laws that we're seeing pop up in these states are a little bit different from the next. Right. Some are more severe. Utah's, again, for instance. 01:05:41 Utah's requires social media platforms to allow parents to read DMs between their children and their peers. So that, again, could really create problems for some of those marginalized teens. At the end of the day, I am not an LGBTQ person. I didn't have this journey myself.
I only know what I've read in the research about how social media can provide kind of a support network for what these teens are going through. 01:06:15 Right. I definitely understand. The only reason why I'm playing devil's advocate is because I'm thinking of the realistic nature of how this would go. And if the parents really wanted to read the DMs, they're going to read the DMs. The teen is not going to stop them from reading the DMs. They're just going to take the phone away and read the DMs. 01:06:32 The parent is not going to take no for an answer. And I don't think a lot of parents actually want to. I don't think most of them want to in the first place. So I'm asking, does this really change how 01:06:42 they're going to act as a parent in the first place? Because if a parent wants to, they're going to. If a parent doesn't care, they're not going to care anyway. I don't think a lot of parents are necessarily up to speed with the social media platforms that their teens, especially late teens, are using to connect with people on the Internet. Right. 01:07:05 And that dynamic absolutely changes if you're required to get parental consent when you log into a new social media platform. Now you're not just monitoring, say, Facebook; you're monitoring their Bluesky access, you're monitoring their Snapchat access, you're monitoring their Twitch. How many parents probably consider the time their teen spends on a platform like Twitch? Right. I think parents today probably think about your main three social media platforms in terms of what they should be keeping their eye on on their teen's phone. 01:07:42 And so I think there is a level of privacy that exists today just because teens live in a different world from their parents. I know when I was 17, my parents didn't know how I connected with my friends on AIM or what programs I used to download music. If I told my parents about LimeWire, I don't think they'd have had any idea.
Yeah, that's very true, very true. 01:08:12 And this is the last thing, and then you can respond if you want, and then we can move on to the next thing. I 100% agree about the generational difference. Parents will never truly understand what is going on. And I can't even say what percentage of parents will be cool with it or anything like that. My only thing in reference to the parents and the kids stuff is that, as of right now, like you said, the parents are not going to know. 01:08:41 And with how social media is, I think social media moving forward is having more and more of an influence and more and more of an impact on how these kids grow up, how these teenagers grow up, and how they are as people. The impact of social media is wide-encompassing. I've kind of done my own research about it, and it's kind of like how public education was back then and how it influenced the children growing up more so than the parents. That's how I believe social media is now, with how much kids are actually using it and how social media and its platforms are now. Especially since you can kind of pick your lane. 01:09:25 It's not a public square anymore. Twitter is kind of losing the public square notoriety now. You can just pick a specific place to go and then just get delved into that very deeply. So that's why I think parents should have more of a role in how their kid grows up. Now, I will say there will be those parents that don't understand. There will be those bigoted parents that are maybe super conservative and they have a gay child and they don't want them to be gay. 01:09:56 But on the flip side, you can have a situation, and we've seen it on social media, where you have a person who's super LGBTQ and the kid just wants to be straight and they don't want them to be like that. We've seen that on social media too. These are exceptions, but they do happen.
So when I'm talking about these situations, I don't think parents should be an authoritarian over their kid's social media. But if we can get them more involved in different ways, especially to help them understand what is going on with my kid and what they're consuming, I think that will be very important. 01:10:32 At the very least, more so than the state governments deciding what content moderation is going on. Yeah, you bring up a lot of good points. At the end of the day, there does need to be a serious conversation between parents and their kids about responsible social media use. And it's a good thing that we see a lot of these social media platforms, as they face this regulatory pressure, offering parents increased tools to monitor their children's social media use. But when it comes to these bills, it is an assessment of: do the benefits outweigh the harms? 01:11:12 Sure. In our assessment, we believe that the privacy harms, requiring everybody to show their ID at the door of these social media websites, and the harms to marginalized teens and teens who don't have supportive parents, outweigh the benefits of these laws. Definitely understand. Are there any other laws that we haven't discussed in reference to social media? I thought you were going to just say tech laws, 01:11:43 and I was like, that would require a whole other podcast. I agree. Social media laws: there's a bill proposed at the federal level that would implement a similar parental control measure across the country, but it is probably not going anywhere in this Congress. The most action we're seeing, the most bills we're seeing pass, is all at the state level. It's happening in states. 01:12:15 And largely these parental control laws are passing in Republican states rather than Democratic states. Right? Yeah.
The other area of social media and content moderation that is still up for debate, still being considered, involves laws that prohibit social media platforms from moderating content, specifically political content. A couple of years ago, Florida and Texas passed laws that would prohibit social media platforms from taking down political viewpoints, and the laws say this in different ways. Florida's law prohibits any social media platform from taking down any political candidate: if they're running for office, you can't take them off your platform. Texas's law prohibits you from taking down posts that express political viewpoints. Of course, at the end of the day, what is a political viewpoint? That's very hard to define. Is Nazism a political viewpoint? Because, again, we should be taking that stuff down. Users don't want to see that, and it's not helpful to our political discourse. So those laws are actually going through the courts right now, and the courts are going to make the other important half of this decision, aside from what state lawmakers are doing, about what regulations social media platforms face and the content they can show us. The courts are determining whether those laws are in violation of the First Amendment, because the First Amendment applies not only to what we can say but, in this case, to what we can see. They are considering whether these laws prohibiting content moderation violate the First Amendment rights of the social media platforms themselves, which have a First Amendment right to determine what they host on their own pages and what kind of content they're willing to publish, because you cannot force publishers to publish speech that they don't want to publish.

Yeah, that's very interesting.
Obviously, this is something conservatives wanted to push, for obvious reasons. But I'm curious: would something like the Daily Wire publish left-leaning material on their platform? That's my question. I wonder what their thoughts on that would be.

Well, if we forced them to, it'd be a violation of the First Amendment.

Yeah, that's why I feel like a lot of this terminology needs to be distinguished. Like you said, do they have lines in the sand on what counts as a political viewpoint, to make sure we're not putting out very disgusting things? I see some people who defend the First Amendment say we should allow everything, and I have deep conversations about all the amendments, but I feel like there are certain things we don't need on these platforms.

As they debated this Texas law, there were amendments brought up to the bill that would have carved out, say, antisemitism. I think that was specifically one of the amendments to the legislation, allowing platforms to...

Are you enjoying today's podcast episode? I really hope you do, and I really hope you enjoy the fact that I have an amazing guest having this great discussion with me. If you personally have your own podcast, and maybe you want to have great guests on your podcast as well, I've got a deal for you. In my description, there is a link to something called PodMatch. Make sure to join through my affiliate link so you can sign up to get matched up with other podcast hosts and podcast guests, so you are never missing an episode without a productive guest to have an amazing conversation with. PodMatch is similar to any other kind of matching site, for the most part, and it's super easy: just $6 a month.
You can have a guest for each and every podcast episode, tailored to your specific topic. So again, join through the link in my description and join PodMatch now.

Sorry, I'm not sure where I left off.

The Texas legislature, yeah, was considering amendments to this social media law that would prohibit platforms from taking down viewpoints. One of the amendments would have carved out antisemitism, so that platforms would be allowed to take down antisemitic content even if it was a, quote unquote, viewpoint; and that amendment, at the end of the day, wasn't accepted. So with the law in effect, would platforms be able to actually take that content down? That's a question the courts would have to decide. But to me, again, it's much better to leave platforms with the room to take down that harmful content.

Yeah, I'm curious how that plays out, because recently there was an antisemitism measure put in place to help ensure we're teaching more about why that's not okay. Basically, that's what they're trying to do with it. So they're definitely trying to push that one.

I just hadn't heard of it.

No, it's all good. I mean, it makes sense, especially with Kanye doing what he was doing. Obviously, they're going to put that in place to make sure a specific group of people isn't attacked, because it seemed like it was growing a little bit.

It'll be the Supreme Court that decides these laws too, by the way; that's where it's all headed. So it will be interesting to see, especially with a conservative Supreme Court, what they decide about these laws passed by Texas and Florida.

Yeah, I'll be very interested to see what they decide to do.
So my last thing in reference to this discussion: for you personally, in terms of social media laws and how they should be implemented for teens, what do you think is the most ideal way that social media should be used, and maybe regulated, for teenage use moving forward?

Yeah. One of the laws we supported that passed earlier this year provided additional funding for the National Institutes of Health to research the impact of social media on teen users: to get more information about where any mental health impacts are coming from, how we can protect these users effectively, and where social media can be beneficial, and how we should be protecting that beneficial access. That funding actually passed, so that's great news. I'm excited to see that NIH research come to fruition, and I think it will be helpful in shaping some of these policies. When it comes to teen use of social media, one of the most important things to keep in mind is that when we require online platforms to implement separate services, protections, and safeguards for young users than they do for adult users, and we require age verification, we create privacy risks. So wherever we can provide protections that apply to the entire public rather than just to teens, that's helpful, and it avoids creating additional risks for users. Privacy is one great example: if we want to pass privacy protections, we should pass them for adult users and teen users alike. It shouldn't matter what age you are to have your personal data protected online. There should be a holistic federal law that protects the privacy of your personal data.
If we create separate safeguards for teen users, again, you actually expose them to more risks, because we all have to go through an age verification process, and there is no age verification process that protects your personal data. Those would be my recommendations.

All right, excellent. A lot of what you said sounds very good, especially the parts referencing mental health, because I did peek at that study as well, on how mental illness in teens has grown due to social media. My big thing about social media, which I've been reinforcing a lot today, is that technically you don't even need any of these laws to be a parent. Right? You don't need any of these laws to pay attention to what your kid is doing. Pay attention to how they're spending their time on social media and be involved in their life. Do not let the Internet raise your kids. Do not let the Internet raise your kids. If you don't want them to grow up a certain way, then, I don't know, maybe do some parenting. So that's my biggest thing about social media. And that's why I say that my generation, and the generations close to mine that grew up before social media or during its rise (it's peaking now and still climbing), should have enough nuanced perspective to understand that social media can be very bad if it's used too much. I think we all understand this: problematic use is never good.

Yeah, exactly.

So that is all we've got for today. I think this was an excellent discussion about social media and teens. Do you have any final words?

No, I just really appreciate you hosting me, and I really appreciate the thoughtful conversation on a hard issue. Thank you.

Yeah, no problem.
I thought it was an excellent conversation to dive into, and very nuanced when it comes to social media, especially in reference to the First Amendment and the rights not only of the people but of the corporations behind the social media companies, and what rights they have to decide what kind of content appears on their respective platforms.

Well, listen, if you ever want to talk about another tech policy, whether that's artificial intelligence or the deployment of autonomous vehicles, we're ready.

Yeah, AI is the topic of conversation today. Oh, man, I don't know how I feel about AI. It's so helpful, but I feel like there are so many ways this can go so bad.

Hot take of the day: I think AI is actually going to end up benefiting most of our workforce, and that it's going to be more of an equalizer in the workplace than we expect.

Hey, I'm willing to hear it.

I'm ready to defend it, all right?

I'm willing to hear it out, because I know those Terminator skeptics out there are a bit concerned, I'm sure. Hope you guys enjoyed today's episode. Per usual, you can find information on the website. Rate it five stars, of course, and tune in for next week's episode. You all have a good one. Take care and peace.

Communications Director
Chris MacKenzie is the Communications Director for the Chamber of Progress, a new center-left tech industry policy coalition promoting technology’s progressive future. A creative communications professional with a decade of Democratic politics under his belt, Chris has a strong background in connecting with center-left voters and thought leaders through digital media campaigns and traditional press outreach.
A veteran of Capitol Hill and the campaign trail, he most recently served as Communications Director for former New Democratic Coalition and Blue Dog Coalition member Rep. Kendra Horn (D-OK). His high-visibility approach to public affairs mobilized support for Rep. Horn in a conservative state, and on Election Day, Rep. Horn was one of fewer than a dozen competitive Democratic candidates nationally to outperform President Joe Biden in their congressional district. Leading communications for Rep. Horn, who sat on the House Committee on Science, Space, and Technology, Chris regularly wrote on issues impacting America's tech leaders and businesses.
Chris previously served under two other New Democratic Coalition-members, former Rep. John Delaney (D-MD) and Rep. Terri Sewell (D-AL). Chris worked as a communications aide under Rep. Delaney when Delaney founded the A.I. Caucus to discuss issues of artificial intelligence with House lawmakers. With Rep. Sewell, Chris led communications and managed a public affairs campaign promoting the Voting Rights Advancement Act, a top Democratic priority in Congress today.