Mixed-Age Audiences and COPPA: What EdTech Founders Must Do When Teens Use Your Platform

COPPA's April 22, 2026 compliance deadline has passed, and most EdTech platforms with teen users are already out of compliance with the amended rule's mixed-audience requirements. This article covers what triggers the obligation, why the general audience defense fails most K-12 operators, and how to design a consent flow that survives FTC scrutiny.


What Triggers the Mixed-Audience Obligation

The 2025 COPPA amendments, published April 22, 2025 and carrying a compliance deadline that passed on April 22, 2026, created a formal third category of regulated platform under 16 CFR § 312.2: the mixed-audience website or online service. This is not a gray area or a matter of interpretation. If your platform is directed to children but does not target children as its primary audience, you fall into this category and you must implement a neutral age-screen before collecting any personal information from any user.

For most EdTech founders, the threshold question is whether the platform is "directed to children" at all. The FTC answers that with an eight-factor test under 16 CFR § 312.2, and K-12 platforms routinely satisfy several of these simultaneously: subject matter tied to academic curricula, visual content and characters calibrated to school-age users, and, most importantly, actual audience composition data showing children among your users. You do not need to fail all eight factors to trigger the rule. Satisfying a combination is sufficient, and the FTC weighs them holistically.

The factor that trips up most EdTech operators is the last one: empirical evidence of actual audience composition. If your platform serves K-12 students through school contracts, those contracts are documented proof that children use your product. The FTC has stated explicitly that operator representations to schools about a platform's youth audience constitute evidence of child-direction, regardless of whether the platform also sells to adult learners or enterprise customers. A B2B go-to-market strategy does not insulate you from COPPA if the end users sitting behind those contracts are minors.

The compliance deadline has already passed. The FTC's amended COPPA rule took effect June 23, 2025, with a full compliance deadline of April 22, 2026. If your platform serves any combination of child and teen users and you have not implemented a neutral age-screen, you are currently out of compliance.

The financial stakes make miscategorization a company-level risk. Civil penalties run up to $53,088 per violation under the FTC's inflation-adjusted schedule at 16 CFR § 1.98, and violations are measured per affected child. In United States v. Google/YouTube (2019), that math produced a $170 million settlement. In the FTC's 2022 action against Epic Games (Fortnite), the total settlement reached $520 million — including a record $275 million COPPA civil penalty and $245 million in dark-pattern refunds. For a growth-stage EdTech company, a single enforcement action at that scale is not survivable.

The practical upshot: if your platform has any K-12 users, whether through direct sign-up, school district contracts, or a freemium tier marketed to educators, you should assume the mixed-audience category applies until you have done a formal analysis under the eight-factor test. The risk of assuming you fall outside COPPA's scope is orders of magnitude greater than the cost of building the age-screen. For a full breakdown of what that compliance infrastructure looks like under the amended rule, Promise Legal's COPPA compliance guide for startups walks through each requirement in sequence.

The General Audience Defense — and Why It Fails Most EdTech Platforms

The most common argument EdTech founders make to avoid COPPA is some version of: "We sell to schools, not to kids." That argument has a name in the regulatory framework — the general audience defense — and it runs into a specific, narrow legal standard under 16 CFR § 312.3 that K-12 platforms almost never satisfy. The defense requires that you have no actual knowledge that any user is under 13. Not most users. Not a class of users. Any user, anywhere on your platform.

The FTC sets the actual-knowledge threshold deliberately low. Per agency guidance, if your age-neutral registration form captures a user entering a birthdate that indicates they are under 13, you cannot discard that data and continue treating the platform as a general audience service. A parent complaint email referencing a child account triggers the same result. So does a school district contract that names a K-8 student population as the end user. These are not edge cases — they are the core business facts of most EdTech companies, and each one is individually sufficient to destroy the general audience defense for at least some portion of your user base.

United States v. Musical.ly (now TikTok Inc.) (2019) is the clearest enforcement illustration. The platform nominally required users to be 13 or older. The FTC held that the combination of child-directed content, advertising calibrated to young audiences, and internal knowledge of underage users constituted actual knowledge sufficient to disqualify the general audience defense. The resulting consent decree carried a $5.7 million civil penalty, the largest COPPA penalty at that time. A follow-on action filed in 2024 alleged violations of the earlier order. Age gates alone are not a defense when the surrounding facts show the operator knew children were present.

For EdTech specifically, the FTC has made its position explicit. In its 2022 policy statement on ed tech surveillance, the agency stated: "Companies that knowingly collect children's data as part of their core educational technology business are on notice that the FTC expects them to comply with COPPA." School district contracts are documentary evidence that you knew your users included minors. The FTC will not credit a general audience claim from a company whose sales team spent six months closing a contract with a middle school district.

The 2025 amended rule tightened this further. The FTC's commentary on the final rule explicitly calls out platforms that design products with features appealing to children while claiming a general audience designation, describing it as an evasion the rule is specifically intended to address. Passive ignorance is no longer a viable position for mixed-age platforms.

The practical read for founders: if you have a single school contract, a single self-reported age indicating a user under 13, or a single parental inquiry about a child account, the general audience defense is unavailable to you for COPPA purposes. You are in the mixed-audience category and the age-screen obligation applies. The COPPA compliance framework for mixed-audience platforms requires neutral age-screening before any personal data collection — not a disclaimer, not a checkbox, and not a 13+ terms-of-service provision that you are not actually enforcing.

Verifiable Parental Consent Under the Amended Rule

If your platform serves children under 13, you need verifiable parental consent (VPC) before collecting any personal information. The 2025 amendments to 16 CFR § 312.5 expanded the menu of acceptable VPC methods to nine, giving operators more options but also imposing a new structural requirement that most EdTech platforms are not currently meeting.

Under the amended rule, you may collect verifiable parental consent through any of the following: (1) a signed consent form returned by mail, fax, or electronic scan, (2) a monetary transaction with a credit or debit card or other payment system that notifies the account holder, (3) a call to a toll-free number staffed by trained personnel, (4) a video conference with trained personnel, (5) verification against a form of government-issued ID, (6) knowledge-based authentication using dynamic questions a child could not reasonably answer (new in 2025), (7) facial recognition matching against verified photo ID (new in 2025), (8) text message coupled with additional confirming steps (new in 2025), or (9) email coupled with additional confirming steps, available only for internal uses. The three new methods give EdTech platforms workable digital-native options that did not exist under the prior rule. For most product flows, government ID verification or knowledge-based authentication will be the operationally realistic path.

Separate Consent for Third-Party Disclosures

This is the change that will require the most immediate attention. Under 16 CFR § 312.5(a)(1)(ii), if your platform discloses a child's personal information to any third party, you must obtain a separate parental consent specifically authorizing that disclosure. That consent cannot be bundled into your general terms of service or rolled into your baseline data collection consent. It must stand alone.

The practical consequence is significant. Every third-party SDK you have embedded in your platform — analytics tools, crash reporting services, advertising networks, even some payment processors — that collects personal information from child users requires its own authorization from the parent. If you are running five SDKs and have one consent flow, you are already out of compliance under the amended rule. Audit your SDK stack now, map which ones touch user data, and build consent pathways that address each disclosure separately. The COPPA compliance framework for startups includes a data-flow mapping approach designed for exactly this audit.
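To make that audit concrete, here is a minimal sketch of a consent-record data model in TypeScript — illustrative only, with invented type and field names — in which each third-party disclosure carries its own parental authorization rather than a bundled flag.

```typescript
// Hypothetical data model: one parental authorization per third-party disclosure.
// Names and fields are illustrative, not drawn from any statute or vendor SDK.

type VpcMethod =
  | "consent_form"
  | "payment_transaction"
  | "toll_free_call"
  | "video_conference"
  | "government_id"
  | "face_match"
  | "knowledge_based_auth"
  | "text_plus"
  | "email_plus";

interface DisclosureAuthorization {
  thirdParty: string;        // e.g. the analytics vendor receiving the data
  dataCategories: string[];  // what is actually shared (device IDs, events, ...)
  authorizedAt: Date;        // when the parent approved this specific disclosure
}

interface ParentalConsentRecord {
  childUserId: string;
  method: VpcMethod;                       // which VPC method produced this record
  baselineCollectionAuthorized: boolean;   // consent for the operator's own collection
  disclosures: DisclosureAuthorization[];  // one entry per third party — never bundled
}

// A child user's data may flow to a third party only if that disclosure
// was individually authorized by the parent.
function disclosureAuthorized(record: ParentalConsentRecord, thirdParty: string): boolean {
  return record.disclosures.some((d) => d.thirdParty === thirdParty);
}
```

The point of the structure is that disclosureAuthorized() has no way to return true for an SDK the parent never individually approved.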

The Email-Plus Shortcut Has Narrow Limits

Email-plus, the method where you send a confirmation email to a parent after collecting initial information, remains available under 16 CFR § 312.5(b)(3). But its scope is narrower than most platforms assume. Per FTC guidance, email-plus is appropriate only where the operator will not use the information for any purpose other than its own internal use and will not disclose it to third parties. If your platform shares data with anyone outside the organization, you cannot use email-plus for that data. Platforms that use email-plus as a one-size-fits-all consent mechanism while simultaneously running analytics SDKs or sharing data with content partners are misapplying the method.
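The internal-use-only limit lends itself to a programmatic guard. A sketch, assuming your data practices are declared in a central structure (the shape below is invented):

```typescript
// Hypothetical guard: email-plus is only selectable when no data leaves the operator.
interface DataPractices {
  thirdPartyDisclosures: string[]; // vendors or SDKs that receive user data
  publicSharing: boolean;          // e.g. public profiles or posts
}

function emailPlusPermitted(practices: DataPractices): boolean {
  // Internal-use-only condition: any external disclosure disqualifies email-plus.
  return practices.thirdPartyDisclosures.length === 0 && !practices.publicSharing;
}
```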

Data Minimization at the Age Screen

Under 16 CFR § 312.5(a)(3), during the age-screening phase, you may collect only the minimum information needed to determine whether a user is a child. Date of birth collected to run the age calculation is permissible. Collecting an email address, full name, or school affiliation at registration before consent is obtained is not. Many EdTech platforms present a full profile-creation form upfront and then ask for age at the end of the flow. That structure violates the data minimization requirement for the pre-consent collection period.
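One way to enforce the pre-consent minimization rule server-side — a sketch with an assumed payload shape, not a prescribed API — is an allowlist that rejects any registration field beyond the date of birth:

```typescript
// Hypothetical server-side check: reject any pre-consent registration payload
// that carries more than the single field needed for the age determination.
const PRE_CONSENT_ALLOWED_FIELDS = new Set(["dateOfBirth"]);

function validatePreConsentPayload(payload: Record<string, unknown>): string[] {
  return Object.keys(payload)
    .filter((key) => !PRE_CONSENT_ALLOWED_FIELDS.has(key))
    .map((key) => `Field "${key}" may not be collected before the age determination.`);
}
```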

Anti-conditioning rule (16 CFR § 312.7): You cannot require a child to disclose more personal information than is reasonably necessary to participate in the activity as a condition of access. Requiring a full profile — name, school, grade level, photo — before a child can use a reading tool or take a quiz likely violates this rule if that data is not genuinely necessary to deliver the core experience. Evaluate every required field in your onboarding flow against this standard.
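The field-by-field audit can be made mechanical. A sketch, assuming onboarding fields live in a central registry (all names invented): every required field must carry a documented necessity justification, and any that do not are surfaced as findings.

```typescript
// Hypothetical field registry: every required onboarding field must state why
// it is necessary to deliver the core service (a 16 CFR § 312.7 audit trail).
interface OnboardingField {
  name: string;
  required: boolean;
  necessityJustification?: string; // must be present when required === true
}

function auditAntiConditioning(fields: OnboardingField[]): string[] {
  return fields
    .filter((f) => f.required && !f.necessityJustification)
    .map((f) => `Required field "${f.name}" has no documented necessity.`);
}

// Example: a photo requirement with no justification surfaces as a finding.
const findings = auditAntiConditioning([
  { name: "dateOfBirth", required: true, necessityJustification: "Age determination at Gate 1." },
  { name: "profilePhoto", required: true }, // no justification — flagged
]);
```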

State Laws Protecting Teens 13–17

COPPA draws its line at 13. States have drawn their own lines at 18, and those lines carry enforcement teeth that operate independently of the FTC. Three state frameworks are in play — each at a different stage of litigation and effectiveness — and a federal bill is waiting on reintroduction in Congress. The critical rule: these laws stack. Complying with California does not satisfy Texas. Complying with both does not satisfy South Carolina. You identify the strictest applicable requirement for each specific obligation and meet that standard across every state where you have users.

California: Age-Appropriate Design Code

California's Age-Appropriate Design Code (AB 2273, Cal. Civ. Code § 1798.99.30 et seq.) faces significant legal uncertainty following the Ninth Circuit's March 2026 ruling in NetChoice, LLC v. Bonta. The court enjoined five of six challenged provisions; a sixth was remanded for further proceedings. As of this writing, none of the Act's substantive requirements are in effect pending resolution of the litigation.

Monitoring the CA AADC's litigation trajectory remains strategically important. The Act's design requirements — age estimation without demanding documentary proof, default privacy settings limited to what is strictly necessary for teen users, and safe messaging guidelines for topics such as suicide and eating disorders — represent the direction California is moving even while the injunctions hold. Structuring your teen-user architecture to accommodate these requirements now makes compliance less disruptive if the provisions are ultimately upheld on remand or in subsequent proceedings.

Texas: SCOPE Act

Texas enacted the Securing Children Online through Parental Empowerment (SCOPE) Act in 2023 (HB 18; Tex. Bus. & Com. Code ch. 509). Litigation in the Fifth Circuit blocked the age verification and advertising restriction provisions, but the data minimization requirement is in effect now. If your platform targets minors or has actual knowledge of minor users in Texas, you must limit data collection to what is necessary to provide the service, prohibit sale of minor data, and restrict targeted advertising based on sensitive data categories (NetChoice, LLC v. Paxton, 5th Cir. 2024).

The Texas obligation is not limited to under-13 users. The SCOPE Act covers under-18. If your EdTech platform has Texas teen users, that data minimization floor is already in effect regardless of whether those users ever triggered your COPPA age-screen workflow.

South Carolina: The Most Aggressive Framework

South Carolina enacted its own Age-Appropriate Design Code in 2026 (S.C. Code §§ 39-5-840 through 39-5-870), and it goes further than California on enforcement. The statute includes a private right of action, treble damages for willful violations, and individual officer and director liability. It also requires a data protection impact assessment (DPIA) before launching teen-facing features — an obligation a federal court has enjoined in California's version, but one that is operative in South Carolina. A single willful violation by your platform in South Carolina does not just create regulatory exposure; it creates personal liability for your executive team and a litigation target for plaintiffs' attorneys.

If you have users in South Carolina, the DPIA requirement is not optional. Conduct and document it before shipping any feature that reaches minors, and structure it to demonstrate that each data practice is necessary and proportionate to the service you provide. For more detail on building a multi-state teen privacy compliance program covering AADC, SCOPE, and parallel state privacy laws, Promise Legal's compliance guide covers the operational workflow.

The stacking rule in practice: A platform with users in California, Texas, South Carolina, and New York faces teen-privacy obligations under at least four distinct state frameworks (AADC, SCOPE, SC AADC, and New York's SAFE for Kids Act) on top of COPPA. Each framework has distinct timelines, enforcement mechanisms, and liability exposure. The most restrictive provision for each specific obligation governs. There is no single-state compliance shortcut.

Pending Federal Legislation: KOSA

The Kids Online Safety Act (KOSA, S. 1409) passed the Senate in July 2024 with broad bipartisan support but was not enacted — the bill did not receive a House vote before the 118th Congress ended. KOSA is not current law; it would need to be reintroduced in the 119th Congress to advance further. The bill's scope nonetheless illustrates the federal direction: a duty of care on platforms serving minors, covering harmful content recommendations, age-restricted purchases, and parental controls, backed by FTC enforcement authority and civil penalties. If a successor measure passes, it would set a federal compliance floor above which state laws continue to add requirements — not a ceiling that preempts them. Build your architecture now to accommodate an additional federal layer, because retrofitting a deployed platform is substantially more expensive than designing for it upfront.

The Two-Gate Registration Architecture

The four compliance frameworks covered above — COPPA's verifiable parental consent rules, state age-appropriate design codes, data minimization obligations, and the anti-conditioning rule — all converge at the same operational chokepoint: your registration and onboarding flow. Every EdTech platform that serves a mix of children, teens, and adults must answer the same architectural question before a single field is presented to a user: who is this person, and what data am I permitted to collect from them right now? A two-gate architecture answers that question systematically and, when implemented correctly, dramatically reduces the enforcement exposure that flows from getting it wrong.

Gate 1: The Neutral Age Screen

Gate 1 exists for one purpose only — to determine the user's age category before any other data is collected. Under 16 CFR § 312.5(a)(3), the only personal information you may collect at this stage is the minimum necessary to make the age determination. Date of birth collected to run the age calculation is permissible. An email address, a full name, a school affiliation, or a grade level are not permissible at this stage, because none of them are necessary to determine age. Most EdTech founders invert this logic by presenting a complete profile form first and appending an age question at the end — the equivalent of building out an entire user record and then checking whether you were allowed to collect it.

The screen itself must be designed to be neutral, not to steer. The FTC has flagged specific patterns as inadequate: defaulting the age selector to a value above 13, making the "I am over 13" option the visually dominant or easier tap target, placing the age question after a lengthy profile creation flow that creates sunk-cost pressure to answer a particular way, or using language that signals what answer is expected. None of these violate COPPA by accident — they reflect deliberate product choices to minimize the population that triggers the parental consent workflow, and the FTC reads them as such. The screen must present the age question cleanly, without visual hierarchy that advantages any particular response, and it must appear before any other data collection begins.
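A minimal sketch of the Gate 1 logic consistent with those constraints — date of birth in, age category out, nothing else collected or stored. The neutrality requirements (no preselected value, no visually dominant option) belong to the form layer; the names here are illustrative.

```typescript
// Hypothetical Gate 1: the only input is a date of birth, and the only output
// is an age category. No other personal information is touched at this stage.
type AgeTier = "child" | "teen" | "adult";

function classifyAge(dateOfBirth: Date, now: Date = new Date()): AgeTier {
  let age = now.getFullYear() - dateOfBirth.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dateOfBirth.getMonth() ||
    (now.getMonth() === dateOfBirth.getMonth() && now.getDate() >= dateOfBirth.getDate());
  if (!hadBirthdayThisYear) age -= 1;

  if (age < 13) return "child"; // routes to Gate 2: all collection halts pending VPC
  if (age < 18) return "teen";  // routes to data-minimization defaults
  return "adult";
}
```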

Gate 2: Verifiable Parental Consent

Once Gate 1 identifies a user as under 13, all data collection stops until verifiable parental consent is obtained. No analytics event fires. No session ID is assigned. No email is collected from the child to send a consent request. Under 16 CFR § 312.5(a)(1)(ii), if your platform will share that child's personal information with any third party, the consent you collect at Gate 2 must specifically authorize each such disclosure — a single "I agree to the terms" checkbox cannot serve as consent for both general data collection and third-party sharing. Each third-party SDK that will touch child data requires its own parental authorization, documented separately.

The practical implication runs deeper than your consent UI. Every third-party SDK embedded in your platform — analytics tools, crash reporters, A/B testing frameworks, customer support widgets — that fires on page load or on any event reachable by an uncleared child user is a live COPPA violation for every child who reaches that point. An analytics SDK that fires before Gate 2 is cleared constitutes unauthorized data collection and disclosure for each affected child. Mapping your SDK calls against user consent tiers is not an optional audit step; it is a prerequisite to deploying any new third-party tooling into a section of the product that child users can reach.
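The gating pattern that mapping implies can be sketched as a central SDK registry — names invented, not any vendor's actual API — where each integration declares the consent states it may run under and nothing initializes otherwise:

```typescript
// Hypothetical SDK gate: no third-party code initializes until the user's
// consent state permits it. "pendingVpc" covers children awaiting Gate 2.
type ConsentState = "unknown" | "pendingVpc" | "childWithVpc" | "teenMinimized" | "adult";

interface SdkEntry {
  name: string;
  allowedStates: ConsentState[]; // states in which this SDK may run
  init: () => void;              // the vendor's initialization call
}

function initializeSdks(registry: SdkEntry[], state: ConsentState): void {
  for (const sdk of registry) {
    if (sdk.allowedStates.includes(state)) {
      sdk.init();
    }
    // SDKs not cleared for this state simply never load — they must not fire
    // on page load. For "childWithVpc", clearance additionally requires a
    // per-SDK disclosure authorization (see the consent-record sketch above).
  }
}
```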

Three-Tier Routing After the Gates

With both gates implemented, your platform routes users into one of three tiers, each with distinct data governance obligations. Confirmed users under 13 proceed under full VPC requirements: no data collection, no third-party SDK calls, and no profile expansion beyond what consent specifically authorizes. Users aged 13 through 17 operate under a data minimization default — profiling is off, targeting is restricted, and the state-specific requirements from California, Texas, and South Carolina apply based on where those users are located. Confirmed adults proceed under your standard data practices. These tiers are not internal labels; they are enforcement categories with distinct legal consequences, and your data infrastructure needs to enforce them at the technical level, not just document them in a privacy policy.
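Enforcing the tiers "at the technical level" can be as simple as a per-tier configuration object that the rest of the data pipeline consults. A sketch with invented flag names:

```typescript
// Hypothetical per-tier governance defaults, enforced in code rather than policy text.
type AgeTier = "child" | "teen" | "adult"; // redeclared so this sketch stands alone

interface DataGovernance {
  collectionAllowed: boolean;   // any collection at all (children: only post-VPC)
  profiling: boolean;           // behavioral profiling
  targetedAdvertising: boolean; // targeted or cross-context advertising
  thirdPartySharing: "none" | "per-authorization" | "standard";
}

const TIER_DEFAULTS: Record<AgeTier, DataGovernance> = {
  child: { collectionAllowed: false, profiling: false, targetedAdvertising: false, thirdPartySharing: "none" },
  teen:  { collectionAllowed: true,  profiling: false, targetedAdvertising: false, thirdPartySharing: "per-authorization" },
  adult: { collectionAllowed: true,  profiling: true,  targetedAdvertising: true,  thirdPartySharing: "standard" },
};
```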

Pre-launch SDK audit checklist:

  1. Map every third-party SDK that calls out on any page reachable by uncleared users.
  2. Identify whether each SDK collects or transmits any personal information, including device identifiers, IP addresses, or behavioral data.
  3. Confirm that no SDK fires for a child user before Gate 2 consent is cleared.
  4. For SDKs that touch child data post-consent, verify that parental authorization for that specific third-party disclosure exists in your consent record.
  5. For teen users, confirm that each SDK's data use is consistent with your data minimization defaults and does not constitute profiling or targeted advertising under California or Texas law.

Repeat this audit before shipping any new third-party integration.
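Step 3 of the checklist is the one most worth automating. A sketch of a recurring CI check, reusing the registry shape from the gating example above (still hypothetical):

```typescript
// Hypothetical recurring audit: surface any registered SDK that is configured
// to run in a pre-consent state. Run in CI before any new integration ships;
// every finding should be reviewed and either fixed or explicitly justified.
function auditPreConsentSdks(registry: { name: string; allowedStates: string[] }[]): string[] {
  const preConsentStates = ["unknown", "pendingVpc"];
  return registry
    .filter((sdk) => sdk.allowedStates.some((s) => preConsentStates.includes(s)))
    .map((sdk) => `SDK "${sdk.name}" is configured to run before consent is resolved.`);
}
```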

The anti-conditioning rule under 16 CFR § 312.7 applies across all three tiers during onboarding and throughout the product. Every required field in your registration flow, your profile creation, and your feature access gates must map to a genuine operational necessity. Requiring a child or teen to provide a full name, school name, and profile photograph before accessing a vocabulary exercise is almost certainly a violation if that information is not actually necessary to deliver the vocabulary exercise. Audit every required field against that standard, and convert any that cannot survive the scrutiny to optional fields or remove them entirely.

Mixed-Age COPPA Compliance Checklist

EdTech platforms serving any combination of children, teens, and adults have no safe fallback in the general audience defense — the actual-knowledge standard closes that exit for most K-12 operators before they even finish reading the rule. The compliance infrastructure required under the 2025 COPPA amendments, state teen-privacy frameworks, and the anti-conditioning rule must exist before a user ever sees a registration form. Building it retroactively, after data collection has begun, creates both a remediation burden and an enforcement gap covering every user who passed through before the infrastructure was in place.

Phase 1 — Legal Analysis

  1. Run the eight-factor test under 16 CFR § 312.2 to determine whether the platform is "directed to children." Document the analysis in writing, address each factor explicitly, and preserve the documentation as a compliance record. The FTC expects operators to have made this determination before launch, not after an inquiry.
  2. Review every school district contract, marketing material, sales deck, and customer communication for evidence that constitutes actual knowledge of minor users. A single K-8 contract is sufficient to destroy the general audience defense for COPPA purposes; inventory your documentation with that threshold in mind.
  3. Map the applicable state teen-privacy frameworks — California AADC, Texas SCOPE, South Carolina AADC — against the jurisdictions where your platform has users. Identify the strictest applicable requirement for each specific obligation and build your compliance posture to that standard. No single-state compliance approach covers a multi-state user base.
  4. Confirm which of the nine VPC methods under 16 CFR § 312.5 the platform will use for child users, then build the consent record infrastructure to document each authorization separately per third-party data disclosure. Bundled consent does not satisfy the amended rule.

Phase 2 — Technical Infrastructure

  1. Implement a Gate 1 neutral age screen before any data collection begins. Design for neutrality: no default values above 13, no visual hierarchy that advantages one age bracket, no placement at the end of a profile-creation flow. The screen's only job is to determine age category before any other field is presented.
  2. Audit every required onboarding field against the anti-conditioning rule under 16 CFR § 312.7. Each required field must map to a genuine operational necessity for delivering the core service. Fields that cannot survive that test should be made optional or removed entirely.
  3. Build a Gate 2 parental consent flow that obtains separate authorization for each third-party data disclosure before any data is collected from child users. The consent UI must present each disclosure individually — a single blanket agreement does not constitute separate authorization under 16 CFR § 312.5(a)(1)(ii).
  4. Run a full SDK audit: map every third-party SDK call against your user consent tiers and confirm that no SDK fires for an uncleared child user. Analytics tools, crash reporters, A/B testing frameworks, and customer support widgets that call out before Gate 2 is cleared are live COPPA violations for each child who reaches them.
  5. Configure data minimization defaults for teen users aged 13 through 17: profiling off, targeted advertising restricted, and data collection limited to what is necessary for the service. These defaults are not privacy-policy language; they must be enforced at the technical level within your data infrastructure.

Phase 3 — Ongoing Compliance

  1. Conduct a Data Protection Impact Assessment before shipping any new feature that reaches minor users. The DPIA requirement is operative now under South Carolina's framework and is best practice regardless of jurisdiction — it documents that each data practice is necessary and proportionate before the feature goes live, not after.
  2. Establish a recurring SDK audit cadence and run the full audit before deploying any new third-party integration into any section of the product reachable by minor users. Every new SDK is a potential unauthorized data collection point until it has been cleared against your consent tiers.
  3. Document and preserve consent records. Under 16 CFR § 312.10, each VPC authorization must be retrievable on demand. Build your record-keeping infrastructure to allow retrieval by user, by third party, and by authorization date — the audit scenario requires all three query paths (a schema sketch follows this list).
  4. Review the state teen-privacy landscape at least annually. California, Texas, and South Carolina have all moved on teen-privacy within the past three years, and KOSA's federal trajectory adds another variable. A compliance posture that was defensible at launch can become stale within 12 months if it is not reviewed against legislative and regulatory developments.
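As referenced in step 3 above, here is a sketch of a consent ledger exposing the three query paths — by user, by third party, and by date range. The class and field names are illustrative; in production this would be backed by indexed storage rather than an in-memory array.

```typescript
// Hypothetical consent ledger supporting the three audit query paths.
interface StoredAuthorization {
  childUserId: string;
  thirdParty: string;
  authorizedAt: Date;
}

class ConsentLedger {
  private records: StoredAuthorization[] = [];

  add(record: StoredAuthorization): void {
    this.records.push(record);
  }

  byUser(childUserId: string): StoredAuthorization[] {
    return this.records.filter((r) => r.childUserId === childUserId);
  }

  byThirdParty(thirdParty: string): StoredAuthorization[] {
    return this.records.filter((r) => r.thirdParty === thirdParty);
  }

  byDateRange(from: Date, to: Date): StoredAuthorization[] {
    return this.records.filter((r) => r.authorizedAt >= from && r.authorizedAt <= to);
  }
}
```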

The compliance burden here is real. A mixed-audience EdTech platform faces simultaneous obligations under a federal rule with $53,088-per-violation civil penalties, three state frameworks with divergent requirements, and a private right of action in South Carolina that creates individual officer liability for willful violations. The complexity is not an argument for delay — it is an argument for sequenced, documented implementation that creates a defensible record at every layer. Platforms that build this infrastructure correctly, and document that they built it, are in a fundamentally different posture than those responding to an FTC inquiry with a privacy policy and a checkbox.

Working through your platform's compliance posture? Promise Legal advises EdTech founders on COPPA, state teen-privacy frameworks, and data governance infrastructure. Get in touch to talk through where your platform stands.