A2P 10DLC, plainly explained.

A2P 10DLC is the messaging compliance regime that decides whether your SMS gets delivered. Most teams treat it like a form. It isn't — it's a brand-vetting process. Here's what's actually going on and the playbook I use to take volume agencies from ~20% approval to consistently 95–100%.

If you've worked with SMS at any scale on a CRM platform in the last few years, you've run into A2P 10DLC. If you haven't, here's the short version: the US carriers got tired of marketing texts pretending to be from a friend. So they put a registration regime on top of the 10-digit long codes that businesses use to send messages. You register your business (the "Brand"), you register the use case (the "Campaign"), it gets vetted, and your traffic either gets delivered, throttled, or filtered into the void.

The part nobody tells you is that A2P registration is not a form. It's a brand-vetting process. The form is the surface. The thing being graded is the legitimacy and consistency of your business identity. The companies that treat it as a one-shot data-entry exercise lose at this. The ones that treat it as a brand-coherence exercise win.

I'll go through the four moving parts (Brand, Campaign, Sample Messages, Opt-in), the reasons submissions fail, and a short runbook I use with volume agencies. By the time you're done reading you should be able to look at a rejected campaign and know within sixty seconds why it failed.

The four moving parts

1. The Brand

The Brand is the business identity that owns the messages. Legal name, EIN, registered address, website, contact info. The vetting layer behind the scenes (TCR, The Campaign Registry) checks these against public records. If your EIN doesn't tie cleanly to the legal name, or your address is a forwarding service, or your website doesn't exist or doesn't match the business, you'll get rated as low-trust or rejected outright.

The most common Brand-level failure I see: a client uses a "doing business as" name for their website and customer-facing brand, but registers A2P with the LLC name. The two don't line up in records. The Brand fails.

2. The Campaign

The Campaign is the use case. You declare what kind of messages you'll be sending (marketing, account notifications, two-factor auth, etc.), how many per day, who you're sending to, and how the recipients got into your list.

This is where most rejections actually happen. Why? Because people write the Campaign description as a summary of "what we sell," not as a description of "what messages get sent and how recipients ended up receiving them." The reviewer wants to see a clean, end-to-end story: someone opts in here, that's the explicit consent, then they get these messages with this kind of content, and they can opt out by replying STOP.

3. Sample Messages

You provide sample messages that match the Campaign use case. If you registered a Campaign as "appointment reminders" but your sample message says "Hey John, our Black Friday sale starts tomorrow!", the campaign gets rejected. Match samples to use case, full stop.

Include the brand name in every sample. Include opt-out language. If you have an "info" link, include it. Don't truncate. Don't write samples in a tone that sounds like a marketing pitch unless the Campaign is registered as marketing.
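The sample-message rules above are mechanical enough to check automatically. Here's a minimal sketch of a lint pass that mirrors them (brand name present, opt-out language present, no truncation). The function name and the exact checks are my own illustration, not any official carrier ruleset:

```python
import re

# Matches common opt-out phrasing like "Reply STOP to cancel".
OPT_OUT_RE = re.compile(r"\breply\s+stop\b", re.IGNORECASE)

def lint_sample(message: str, brand: str) -> list[str]:
    """Return a list of problems found in one A2P sample message."""
    problems = []
    if brand.lower() not in message.lower():
        problems.append("brand name missing")
    if not OPT_OUT_RE.search(message):
        problems.append("no opt-out language (e.g. 'Reply STOP to cancel')")
    if message.rstrip().endswith(("...", "\u2026")):
        problems.append("message looks truncated")
    return problems

# Flags the missing brand name and the missing opt-out language:
lint_sample("Hey John, your appointment is tomorrow at 3pm.", "Acme Dental")
```

Running every sample through something like this before submission catches the two most common sample-level rejections for free.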

4. Opt-in evidence

Opt-in is the part everyone underestimates. You need to describe (and ideally show) how consumers consent to receive your messages. A web form? Where is it? A keyword? What's the keyword? A point-of-sale signup? Describe it.

Vague opt-in descriptions are the single most common reason for "we need more info" responses from the carriers. "Customers opt in through our website" is not an answer. "Customers fill out a contact form at example.com/contact with an explicit, unchecked-by-default checkbox reading 'I agree to receive SMS messages from BRAND about my appointments. Msg & data rates may apply. Reply STOP to cancel.'" — that's an answer.

Why most submissions fail

I keep a mental tally of why A2P submissions get rejected. Sorted by frequency:

  1. Brand-Campaign mismatch. The Brand says one thing about the business, the Campaign use case says another, the website backs up neither cleanly.
  2. Vague opt-in language. "Customers consent via our website" — not enough.
  3. Sample messages that don't match the registered use case. Marketing samples in an account-notifications campaign.
  4. Missing opt-out language in the sample messages.
  5. Public-records mismatch on Brand (wrong EIN, address, or legal name).
  6. Throughput requests that don't match the use case. Asking for 200 MPS on a low-volume notification campaign.

That's basically the list. Almost every rejection I've ever seen falls into one of those buckets.
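Most of those buckets can be screened for before you ever submit. The sketch below runs a submission through the checkable ones; the thresholds and keyword lists are illustrative guesses, not carrier rules, and bucket 5 (public-records mismatch) needs an actual records lookup, so it isn't covered here:

```python
MARKETING_WORDS = {"sale", "discount", "offer", "deal", "% off"}

def preflight(submission: dict) -> list[str]:
    """Screen a draft submission against the common rejection buckets."""
    flags = []
    # Bucket 2: vague opt-in. A real description names a URL and quotes
    # the consent language, so short or URL-free text is suspect.
    optin = submission.get("opt_in_description", "")
    if len(optin) < 80 or "http" not in optin:
        flags.append("opt-in description too vague (no URL / too short)")
    use_case = submission.get("use_case", "")
    for sample in submission.get("samples", []):
        lower = sample.lower()
        # Bucket 3: marketing language in a non-marketing campaign.
        if use_case != "marketing" and any(w in lower for w in MARKETING_WORDS):
            flags.append("marketing language in a non-marketing sample")
        # Bucket 4: missing opt-out language.
        if "stop" not in lower:
            flags.append("sample missing opt-out language")
    # Bucket 6: throughput out of line with the use case (50 MPS is an
    # arbitrary illustrative cutoff, not a real carrier number).
    if use_case != "marketing" and submission.get("requested_mps", 0) > 50:
        flags.append("throughput request looks high for this use case")
    return flags
```

An empty return doesn't guarantee approval, but a non-empty one almost guarantees rejection, which is the cheap half of the problem to solve.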

The runbook

For volume agencies (clients running A2P for dozens or hundreds of sub-accounts), I built a standardized process. The numbers were ugly when we started — somewhere around 20% approval rate, with the rest stuck in a loop of rejections and resubmissions. After standardizing the runbook, the same teams now sit in the 95–100% range.

None of it is a secret; it's discipline. The core moves:

  1. Brand pre-flight. Before submitting anything, verify the business's legal name, EIN, and address against public records. If any of those don't line up, fix them first. Don't submit.
  2. One Campaign template per use case. Don't write the Campaign from scratch each time. Maintain a library of approved Campaign descriptions and sample-message sets for each common use case (marketing, appointment reminders, lead follow-up, etc.). Reuse and adapt.
  3. Opt-in script for the client. Most clients don't think about opt-in at all. I send them a one-page doc — a checklist of where their opt-in lives, the exact consent language to use, and how to describe it on the form. We update the live form if it's missing anything before submission.
  4. Email-parser tracking. A2P status emails are noisy. I built an email-parsing workflow that pipes inbound status updates into a Google Sheet so we can see, at a glance, which submissions are pending vs. approved vs. rejected and why. (Highly recommended once you're past a handful of campaigns.)
  5. Reject-pattern library. Every rejection we get goes into a shared doc with the rejection reason and what we changed to get the resubmission approved. After a few months, the library starts catching almost everything before submission.
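The email-parser tracking in step 4 is simpler than it sounds. Here's a minimal sketch that pulls campaign ID, status, and rejection reason out of a status-email subject line and appends rows to a CSV. The subject format ("Campaign C123 status: REJECTED - reason") is an assumption — adjust the regex to whatever your provider actually sends — and a real setup would write to the Google Sheets API instead of a local file:

```python
import csv
import re

# Assumed subject format; not any provider's real template.
SUBJECT_RE = re.compile(
    r"Campaign\s+(?P<campaign>\S+)\s+status:\s+(?P<status>\w+)"
    r"(?:\s*-\s*(?P<reason>.*))?"
)

def parse_subject(subject: str):
    """Extract campaign/status/reason from one status-email subject."""
    m = SUBJECT_RE.search(subject)
    if not m:
        return None
    return {
        "campaign": m.group("campaign"),
        "status": m.group("status"),
        "reason": (m.group("reason") or "").strip(),
    }

def append_rows(path: str, subjects: list[str]) -> None:
    """Append one CSV row per parseable subject line."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for s in subjects:
            row = parse_subject(s)
            if row:
                writer.writerow([row["campaign"], row["status"], row["reason"]])
```

Once the statuses land in one sheet, the reject-pattern library in step 5 mostly builds itself.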

The bit nobody likes to admit

A2P is bureaucracy. It rewards rigor. The teams that succeed at it treat it like a process, not like an obstacle. The teams that fail at it keep trying to "submit and hope."

Operating around the system

Two things to remember once you're approved:

Brand and Campaign are not set-and-forget. If your business name or website changes — even slightly — your Brand can drift out of compliance. If your sample messages start drifting from the use case you registered, your filtering rates will quietly creep up. Set a reminder to audit your registered details once a quarter against what you're actually sending.

Carrier filtering doesn't always show up as a hard reject. You can be "approved" and still have a significant percentage of messages filtered if your sender reputation drops or if your message content trips a filter. Treat deliverability as an ongoing metric, not as a one-time gate you cleared.
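Treating deliverability as an ongoing metric can be as simple as a rolling window over delivery receipts. A minimal sketch, assuming you get a delivered/not-delivered signal per message from your provider; the 95% threshold and window sizes are illustrative choices, not carrier numbers:

```python
from collections import deque

class DeliverabilityMonitor:
    """Rolling delivered-rate over the last N receipts."""

    def __init__(self, window: int = 500, threshold: float = 0.95):
        self.receipts = deque(maxlen=window)  # True = delivered
        self.threshold = threshold

    def record(self, delivered: bool) -> None:
        self.receipts.append(delivered)

    @property
    def delivery_rate(self) -> float:
        if not self.receipts:
            return 1.0
        return sum(self.receipts) / len(self.receipts)

    def is_degraded(self) -> bool:
        # Only judge once the window has enough data to be meaningful.
        return len(self.receipts) >= 100 and self.delivery_rate < self.threshold
```

Wire `is_degraded()` to an alert and you'll notice silent filtering in hours instead of discovering it from a client complaint weeks later.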

If you remember one thing

If you do nothing else: treat A2P as a brand-coherence exercise, not as a form. The legal name, the EIN, the address, the website, the Campaign use case, the sample messages, the opt-in story — they all have to tell the same story about the same business. When they do, you get approved. When they don't, you don't. It's that simple, and it's that hard.


If you're stuck on a specific A2P rejection, or you're running A2P at volume and want a sanity check on your process, drop me a line. Always happy to look at a rejection email and tell you what the reviewer probably saw.