Written Testimony of Joel Thayer, President, The Digital Progress Institute
Before the Texas House Committee on Trade, Workforce, and Economic Development
Hearing on HB 4901, “The App Store Accountability Act”
April 15, 2025
My name is Joel Thayer. I am an attorney and president of the Digital Progress Institute, based in Washington, D.C. I am here today to testify in favor of HB 4901, the App Store Accountability Act (the “Act”). I am well positioned to speak on the bill, as I developed the legal and policy framework on which SB 2420 is based.
Before getting into that, let’s begin with why we are even here — to protect children.
The sad reality of the digital age is that moms and dads have been left on their own to fend off a tech-induced health crisis, contending with the allure of products engineered by the most powerful corporations in history to be maximally addictive to kids, while almost every app tracks our children’s data without their parents’ consent. Worse, Big Tech is not only indifferent to the harms children are suffering from its products; there is also strong evidence that it intentionally perpetuates the problem.[1]
That’s why the Act you are considering today is critical to ensuring parents can indeed parent in the digital age.
Let’s start with the model’s underlying theories. I designed the framework with two principles in mind. The first is that it regulates conduct, not content. It relies on a standard legal principle: multi-trillion-dollar companies cannot enter into sophisticated contracts with minors. Make no mistake, when you use an app store, you are entering into a contract, via terms of service and privacy policies, with Apple, Google, and third-party developers to access a whole suite of digital products. The regulation is legally indistinguishable from any other commercial regulation, mainly because, in commercial transactions, sellers and distributors are generally required to know whether they are engaging with a minor, or at the very least to know the identity of the party with whom they are contracting.
This is especially true for the sale and distribution of products. It is why this proposal leverages a standard policy prescription to prevent children from accessing addictive services or products: the onus is on the store to age-gate the product. We require a convenience store to check a patron’s ID before selling cigarettes, alcohol, or pornography, and we hold the store liable when kids access those products, not necessarily the suppliers of the products. In other words, we don’t rely on Philip Morris or Anheuser-Busch to ensure kids aren’t purchasing their products; we look to CVS, 7-Eleven, and supermarkets to age-gate.
What is more, we generally don’t allow a child to obtain a bank loan without a parent present or, at the very least, without requiring a parent to co-sign for the loan.
The app ecosystem should be no different.
To ensure its constitutionality, the framework applies to all contracts minors may encounter on an app store, instead of singling out any particular service or content. It is why the fact that the Act applies to all apps is very much a feature, not a bug. This feature demonstrates to courts that “[t]he legislation…[is] directed at unlawful conduct having nothing to do with…the expressive activity.”[2] In this case, we are making clear that we are worried about a company’s formation of a contract with a minor without a parent or guardian’s oversight; we are not trying to prevent minors from accessing or engaging with a particular app outright. If the parent or guardian wants to allow their child to download an app, or to have no child restrictions at all on app downloads, the Act would permit that. Full stop.
This approach is distinct from what the State of Ohio attempted when it passed the Parental Notification by Social Media Operators Act (the “Ohio Act”).[3] The Ohio Act required “operator[s]” of “online web site[s], service[s], or product[s]” that “target[] children” or are “reasonably anticipated to be accessed by children” to obtain parental consent before allowing any unemancipated child under the age of sixteen to register or create an account on their platform.[4] Specifically, the Ohio Act required a platform to (1) “[o]btain verifiable consent for any contract with a child, including terms of service, to register, sign up, or otherwise create a unique username to access or utilize the online web site, service, or product, from the child’s parent or legal guardian” through a variety of acceptable methods; and (2) present to the parent or guardian a list of features related to content moderation and a link where they may review those features.[5]
The Ohio Attorney General attempted to rely on the premise that “the [Ohio] Act does not regulate speech, simply the ability of minors to contract….”[6] He further argued that “the legislation is concerned with operators’ release of minors’ personal information and data pursuant to exploitative terms of service, addictive social media features like ‘infinite scroll,’ increased rates of mental illness in children, and a risk of exposure to sexual predation on websites that facilitate private messaging between users.”[7]
The Court disagreed with AG Yost’s justification because, at the outset, the Ohio Act was clearly seeking to regulate content, not the conduct of contracting. Why? Because, for one, the Ohio Act excluded a whole host of other sites that have the same features and capabilities to collect a child’s data.[8] Secondly, the Court held that the “[Ohio] Act…certainly requires consideration of the content on an operator’s platform [because the State had] to determine if [the website] ‘targets children’ or is ‘reasonably anticipated to be accessed by children.’”[9] For these reasons, the court found the law to be a content regulation.
The Act here avoids this issue entirely and is likely to be upheld as a conduct regulation. As previously stated, the Act seeks to regulate all contracts, not just ones that “target children” or are “reasonably anticipated to be accessed by children.” Indeed, the Act is indifferent to whether a child is downloading a Bible app or TikTok. If an app has terms of service or a privacy policy, the Act requires the app and the app store provider to seek parental consent after they have determined the user is a child. As was the case with the law the Supreme Court reviewed in City of Austin, Texas v. Reagan National Advertising of Austin, LLC, Texas’s App Store Accountability Act does not “single out any topic or subject matter for differential treatment.”[10] Thus, this type of regulation is closer to the health code violation in Arcara or the divestiture requirement in TikTok v. Garland.[11]
Moreover, enlisting app store providers to perform age verification balances the State’s goal of giving parents legal recourse to protect their children from harmful tech services with the need not to infringe on adults’ online speech. Why is this the case? Well, as Jonathan Haidt rightly put it, “…with device-based verification nobody else is inconvenienced.”[12] A parent verifies their child’s device once with the app store and they’re done. “[T]he internet is unchanged for them”[13] and they are still able to control what their kids see and do on their devices.
This dovetails into the second principle: the framework was developed to use stakeholders’ existing infrastructure rather than reinvent the wheel. Indeed, placing the age-gating responsibility on app stores reduces the costs of age verification for parents, kids, adults, and app developers (large and small).
But how?
Well, app store providers, like Google and Apple, control every aspect of their app marketplaces. As Federal Communications Commission Chairman Brendan Carr has put it, app stores are “the single choking point” of the mobile ecosystem.[14] Every service goes through one of two app stores: Apple’s App Store or Google’s Play Store. App stores are the front door through which every addictive and harmful product reaches kids. Requiring them to verify the ages of users and communicate with the parents of minors streamlines the process, removes the burden on every app developer to verify ages, and spares every adult from going through yet another age-verification process whenever they access a new app.
What is more, the framework builds on studies by Jonathan Haidt and myriad civil society groups, all of which have identified devices and their app stores as the primary venue for addressing age verification and parental consent.
This brings us to why the Institute supports the Act: it is in lockstep with our framework’s structure, legal theory, and rationale. Indeed, the Act’s focus is on contracting, not content. For instance, it adopts the State’s age of majority (i.e., 18 years old) for signing terms of service. It empowers parents to know what their kids are downloading and purchasing on apps through mobile app stores. Better yet, it does not single out any particular app or online service, preserving its constitutional integrity as explained above.
The Act as currently drafted is a conduct regulation, plain and simple. It is concerned with preventing Big Tech, small tech, and everyone in between from taking advantage of minors through deceptive, confusing, and vague contracts. To be sure, the Act is no different, in any fundamental legal respect, from other similarly situated commercial regulations that Texas has previously enacted. For instance, even though § 34.305 of the Texas Finance Code allows a minor to open a bank account, it subjects the account to a parental or guardian veto to “deny the minor’s authority to control, transfer, draft on, or make a withdrawal from the minor’s deposit account….”[15] Likewise, Texas’s Data Privacy and Security Act prevents tech companies from using vague privacy policies to collect, use, store, sell, share, analyze, or process consumers’ personal data without their consent.[16] Texas even provides parents the right to “consent to the child’s marriage, enlistment in the armed forces of the United States, medical and dental care, and psychiatric, psychological, and surgical treatment.”[17]
Given Texas’s strong track record of passing consumer protection and child safety laws, there is no policy reason why it should draw the line at contracts on app stores, especially given the high stakes for children’s mental and physical health and development.
The Act even avoids the obvious privacy objection that Big Tech organizations like to lodge against age-verification measures at the website level. App stores already have all of this age information, which means the user would not need to proffer more data to these platforms, a distinct characteristic from website-level age-verification requirements. Apple and Google already have the tools to take these child-safety measures and are already required to do so in certain contexts.[18] It is why Jonathan Haidt is correct that this type of age verification presents “no privacy threat whatsoever.”[19]
In sum, the Act holds app stores accountable when they intentionally fall short in protecting our children, and passing the Act is an essential step forward in putting families first. We are proud to support it.
Sincerely,
Joel L. Thayer
President & Member of the Board
[1] E.g., Aaron Tilley, Apple’s App Store Puts Kids a Click Away from a Slew of Inappropriate Apps, The Wall Street Journal (Dec. 22, 2024), https://www.wsj.com/tech/apples-app-store-puts-kids-a-click-away-from-a-slew-of-inappropriate-apps-dfde01d5; see also Joanna Stern, How Broken Are Apple’s Parental Controls? It Took 3 Years to Fix an X-Rated Loophole, The Wall Street Journal (Jun. 5, 2024), https://www.wsj.com/tech/personal-tech/a-bug-allowed-kids-to-visit-x-rated-sites-apple-took-three-years-to-fix-it-17e5f65d.
[2] Arcara v. Cloud Books, Inc., 478 U.S. 697, 707 (1986).
[3] Ohio Rev. Code § 1349.09(B)(1).
[4] NetChoice v. Yost, 716 F.Supp.3d 539, 547 (S.D. Ohio 2024) (citations omitted).
[5] Yost, 716 F.Supp.3d at 548.
[6] Id. at 553.
[7] Id. at 555–56.
[8] Id. at 560 (writing that “a child can still agree to a contract with the New York Times without their parent’s consent, but not with Facebook.”).
[9] Id. at 557.
[10] 596 U.S. 61, 71 (2022).
[11] 145 S.Ct. 57 (2025).
[12] Jonathan Haidt, The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, p. 239 (2024).
[13] Id.
[14] Hon. Brendan Carr, X Post, February 12, 2024, https://twitter.com/BrendanCarrFCC/status/1757145971295695226?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1757145971295695226%7Ctwgr%5E761fa5f7abcd99dca5c6e65a6bc1f44570c53b67%7Ctwcon%5Es1_&ref_url=https%3A%2F%2Fwww.theverge.com%2F2024%2F2%2F12%2F24071226%2Ffcccommissioner-brendan-carr-apple-beeper-mini.
[15] Tex. Fin. Code § 34.305(c).
[16] Tex. Bus. & Com. Code § 541.055.
[17] Tex. Fam. Code § 151.001.
[18] Federal Trade Commission, FTC Approves Final Order in Case About Google Billing for Kids’ In-App Charges Without Parental Consent, Press Release (Dec. 5, 2014), https://www.ftc.gov/news-events/news/press-releases/2014/12/ftc-approves-final-order-case-about-google-billing-kids-app-charges-without-parental-consent; Federal Trade Commission, FTC Approves Final Order in Case About Apple, Inc. Charging for Kids’ In-App Charges Without Parental Consent, Press Release (Mar. 27, 2014), https://www.ftc.gov/news-events/news/press-releases/2014/03/ftc-approves-final-order-case-about-apple-inc-charging-kids-app-purchases-without-parental-consent.
[19] Jonathan Haidt, The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, p. 239 (2024).