Why PMax strategy is now input-driven
Google’s Performance Max campaigns have quietly shifted from a “set it and forget it” automation play to something that rewards careful, deliberate input design. The April 2026 update to Google’s PMax developer documentation [1] reinforced what practitioners have been learning the hard way: the algorithm’s ceiling is determined by the quality of what you feed it, not by how cleverly you tweak bids after launch.
This is the operational reality of PMax in 2026. You cannot outmaneuver the system with bid adjustments or placement exclusions the way you could with legacy campaign types. Your levers are asset groups, audience signals, feed quality, and budget allocation. Google’s own documentation frames PMax as a complement to Search campaigns that respects exact and phrase match keywords [11], which means the system is designed to fill gaps, not replace your existing structure. But filling gaps well requires you to define the boundaries clearly.
I’ve spent the last several months auditing PMax accounts across ecommerce and lead gen verticals, and the pattern is consistent: accounts that treat PMax as a black box get black-box results, meaning unpredictable ROAS swings and opaque spend allocation. Accounts that invest in structured inputs see more stable, interpretable performance. The question for 2026 is not whether to run PMax, but how precisely you can define the inputs that shape its behavior.
Structure your asset groups for relevance
Asset groups are the single clearest structural control inside PMax, and weak organization here creates mixed signals that dilute performance across every channel Google serves [3]. Each asset group bundles headlines, descriptions, images, videos, and listing groups into a themed unit. When that theme is coherent, the algorithm can match creative to intent efficiently. When it isn’t, you’re asking Google’s AI to figure out which of your 15 headlines pairs with which of your 20 images for which audience segment, with no thematic guardrails.
Google recommends filling each asset group to capacity: 15 headlines, 5 descriptions, 20 images across square, landscape, and portrait formats, and 5 videos [7]. That recommendation is about combinatorial variety, giving the system enough raw material to assemble high-performing ad combinations. But volume without thematic discipline is counterproductive. A headline about luxury watches paired with an image of budget earrings, served to someone browsing engagement rings, is the kind of mismatch that unthemed asset groups produce.
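A quick way to audit for under-filled groups is to compare each group's asset counts against the recommended capacities above. This is an illustrative sketch, not part of any Google SDK; the data shape is assumed to come from your own export of asset groups:

```python
from dataclasses import dataclass, field

# Google's recommended per-group capacities, as cited in the text.
RECOMMENDED = {"headlines": 15, "descriptions": 5, "images": 20, "videos": 5}

@dataclass
class AssetGroup:
    """Minimal stand-in for an exported asset group (hypothetical shape)."""
    name: str
    headlines: list = field(default_factory=list)
    descriptions: list = field(default_factory=list)
    images: list = field(default_factory=list)
    videos: list = field(default_factory=list)

def coverage_gaps(group: AssetGroup) -> dict:
    """Return how many assets of each type are missing versus the recommendation."""
    counts = {
        "headlines": len(group.headlines),
        "descriptions": len(group.descriptions),
        "images": len(group.images),
        "videos": len(group.videos),
    }
    return {k: RECOMMENDED[k] - counts[k]
            for k in RECOMMENDED if counts[k] < RECOMMENDED[k]}
```

Running this over every asset group in an account surfaces the groups whose combinatorial variety is limited before you look at any performance data.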
How you segment depends on your catalog size and business model. Savvy Revenue’s PMax guide [2] walks through a useful spectrum: a reading glasses store with a narrow catalog can run a single asset group effectively, while a hair products store with 300 SKUs benefits from grouping by customer problem (like “regrowth” or “damage repair”), and a jewelry retailer with 10,000+ products needs broader parent-category groupings (Jewelry, Watches, Accessories). The principle is that each asset group should represent a distinct purchase intent, not just a product taxonomy bucket.
For lead gen accounts, the segmentation axis shifts from product category to service type or audience tier. An enterprise SaaS company might separate asset groups by use case (security, compliance, analytics) with each group carrying messaging and imagery tailored to that use case’s buyer persona. Listing groups within ecommerce asset groups should enforce zero product overlap, because overlapping listings force the algorithm to compete against itself [3].
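The zero-overlap rule is easy to check mechanically. A minimal sketch, assuming you can export each asset group's listing-group product IDs into a simple mapping (the input shape here is hypothetical):

```python
from collections import defaultdict

def find_listing_overlaps(listing_groups: dict) -> dict:
    """Map each product ID that appears in more than one listing group to the
    asset groups that contain it. Input shape is assumed:
    {asset_group_name: {product_id, ...}}."""
    owners = defaultdict(list)
    for group_name, product_ids in listing_groups.items():
        for pid in product_ids:
            owners[pid].append(group_name)
    return {pid: groups for pid, groups in owners.items() if len(groups) > 1}
```

Any non-empty result means two asset groups are eligible to serve the same product, and the algorithm is bidding against itself for it.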
One practical workflow that pays off: query the asset_coverage field on the AssetGroup resource via GAQL to pull Ad Strength ratings and the specific ad_strength_action_items Google recommends [1]. These action items tell you exactly which asset types are missing or underperforming. When you replace low-rated assets, do it after at least two to three weeks of data, and never remove an asset without adding a replacement first, because gaps in coverage reduce your placement eligibility [9].
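The GAQL pull described above might look like the following. The resource and field names (`asset_group.ad_strength`, `asset_group.name`) follow Google Ads API naming conventions, but verify them against the API version you target; the client call is shown only as a comment because it requires authenticated credentials:

```python
# Sketch of a GAQL query for Ad Strength by asset group on PMax campaigns.
ASSET_STRENGTH_QUERY = """
    SELECT
      campaign.name,
      asset_group.name,
      asset_group.ad_strength
    FROM asset_group
    WHERE campaign.advertising_channel_type = 'PERFORMANCE_MAX'
""".strip()

# With the official google-ads Python client, execution would look roughly
# like this (not run here; requires a configured google-ads.yaml):
#
#   from google.ads.googleads.client import GoogleAdsClient
#   client = GoogleAdsClient.load_from_storage("google-ads.yaml")
#   ga_service = client.get_service("GoogleAdsService")
#   for batch in ga_service.search_stream(customer_id="1234567890",
#                                         query=ASSET_STRENGTH_QUERY):
#       for row in batch.results:
#           print(row.asset_group.name, row.asset_group.ad_strength)
```

Extending the SELECT clause to the asset coverage fields cited in [1] turns this into a recurring audit you can schedule rather than a manual check.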
Use audience signals to guide the algorithm
Audience signals in PMax are suggestions, not targeting constraints [6]. Google’s system will expand beyond whatever signals you provide if it finds conversion opportunities elsewhere. This distinction trips up advertisers who expect audience signals to function like the audience targeting in Display or Video campaigns. They don’t. They’re starting points for the algorithm’s exploration, and the quality of those starting points determines how quickly the system finds profitable segments.
First-party data is the highest-value signal you can provide. Customer lists, email segments, and remarketing audiences give the algorithm a concrete profile of who converts for your business. Savvy Revenue’s guidance is blunt on this point: prioritize your own customer and email lists and treat Google’s predefined audiences as secondary [2]. In my experience, accounts that lead with Customer Match lists and high-value purchaser segments see the algorithm reach productive performance faster than accounts that rely on in-market or affinity audiences alone.
That said, there’s a genuine tension in the practitioner community about how much audience signals actually influence PMax behavior. Some agency practitioners have observed that signals do less than expected, with the algorithm quickly expanding beyond them regardless of how precisely they’re defined [8]. Google’s own documentation confirms that signals are non-restrictive [11], which means the system can and will ignore them when it identifies better opportunities. One vendor claims optimized signals improve ROAS by 15-25% within 21 days [12], but no Google-controlled study backs that figure, and I’d treat it as directional at best.
Where signals matter most is in alignment with asset groups. Pairing enterprise-focused creative with a Customer Match list of enterprise accounts and custom segments built around enterprise-intent search terms creates a coherent signal that the algorithm can act on immediately [6]. When creative messaging and audience signal point in the same direction, the algorithm finds its footing faster. When they point in different directions, you’re burning budget on the learning phase while the system resolves the contradiction.
A layering approach works well in practice: start with first-party data as the primary signal, then add in-market and custom intent segments after the campaign has accumulated initial conversion data. Avoid over-managing signals with aggressive exclusions early on, because the system needs room to explore [2]. You can tighten later once you have enough data to identify which expansion paths are wasteful.
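The layering schedule above can be expressed as a simple staging rule. This is an illustrative sketch of the decision, not a Google feature; the 30-conversion threshold is a placeholder I chose, not a documented number, so tune it to your account's volume:

```python
def signal_layers(conversions_to_date: int,
                  min_conversions_for_expansion: int = 30) -> list:
    """Return which audience-signal tiers to attach at this campaign stage.
    First-party signals always lead; broader intent segments are layered in
    only after the campaign has accumulated some conversion data."""
    layers = ["customer_match_lists", "remarketing_audiences"]  # first-party first
    if conversions_to_date >= min_conversions_for_expansion:
        layers += ["in_market_segments", "custom_intent_segments"]
    return layers
```

Encoding the rule this way makes the staging explicit and reviewable, instead of leaving it to whoever happens to touch the campaign next.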
Measure PMax success beyond conversion volume
Raw conversion volume is the laziest way to evaluate PMax, and it’s the metric most likely to mislead you. PMax campaigns can cannibalize branded search traffic, claim credit for conversions that would have happened organically, and inflate volume by chasing low-value micro-conversions. If your only success metric is “did conversions go up,” you’re not measuring PMax performance; you’re measuring Google’s ability to take credit.
Agency analyses consistently find that two to three asset groups drive 70-80% of revenue in most PMax campaigns [12]. That concentration means you need asset-group-level reporting to understand where value actually comes from, and you need to be willing to pause or restructure the groups that consume budget without proportional return. Google’s Ad Strength metric and asset coverage reports [1] provide some visibility, but they measure input quality (creative completeness and variety), not output quality (actual revenue contribution).
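Checking your own account against that concentration pattern is a one-liner once you have revenue by asset group. A minimal sketch, assuming you can export per-group revenue from your reporting:

```python
def top_n_revenue_share(revenue_by_group: dict, n: int = 3) -> float:
    """Fraction of total revenue contributed by the top-n asset groups."""
    total = sum(revenue_by_group.values())
    if total == 0:
        return 0.0
    top = sorted(revenue_by_group.values(), reverse=True)[:n]
    return sum(top) / total
```

If the top two or three groups sit near the 70-80% range the text describes, the long tail of groups deserves scrutiny: they may be absorbing budget without proportional return.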
The metrics that matter more are incremental ROAS, new customer acquisition rate, and channel-level spend distribution. PMax’s cross-channel nature means it can shift spend toward Display or YouTube placements that look efficient on a last-click basis but don’t drive incremental demand. If your PMax campaign is spending 60% of its budget on Display and your Display-attributed conversions aren’t showing up in GA4’s cross-channel reports, that’s a signal worth investigating. Google allows up to 100 PMax campaigns per account [11], but consolidation generally produces better results because it gives the algorithm more data to work with per campaign. The tradeoff is that consolidation reduces your ability to isolate performance by segment.
PMax experiments, which Google introduced to let advertisers A/B test campaign configurations, are the most reliable way to measure whether a structural change (new asset group, different signal, adjusted budget) actually improves outcomes [9]. Running changes without experiments means you’re confounding your optimization with seasonal trends, competitive shifts, and the algorithm’s own learning cycles.
Implement brand safety and negative keywords
PMax’s default behavior is expansive, and that expansion includes final URL replacement. With final URL expansion enabled (which it is by default), Google can dynamically swap your landing pages and generate ad copy based on what it thinks matches user intent [1]. For some advertisers, this works well because it surfaces relevant pages that weren’t explicitly included in the campaign. For others, it sends traffic to blog posts, support pages, or product pages with poor conversion rates.
You can opt out of final URL expansion via the Campaign.final_url_expansion_opt_out setting (in API versions before v22) or through asset automation settings in v22 and later [1]. Whether you should opt out depends on your site architecture. If your site has a clean conversion funnel with well-optimized landing pages, opting out keeps traffic where you want it. If your site has hundreds of product pages and you can’t predict which ones will convert best for long-tail queries, leaving expansion on may surface opportunities you’d miss otherwise.
Negative keywords in PMax have historically been a pain point, requiring Google rep intervention to apply account-level negatives. That process has improved somewhat, but it remains less flexible than Search campaign negatives. Brand term exclusions are particularly important: without them, PMax will happily spend budget on branded queries that your Search campaigns already cover, inflating PMax’s reported performance while cannibalizing cheaper branded clicks [3].
Placement exclusions are another control worth using. PMax serves ads across YouTube, Display, Gmail, Discover, Maps, and Search, and not every placement is appropriate for every brand. Reviewing where your ads appear (through the Insights tab and placement reports) and excluding low-quality or irrelevant placements is basic hygiene that too many advertisers skip. The 20 controls that ALM Corp documents in their optimization guide [3] include several placement and content exclusion options that are worth auditing quarterly.
Prepare for upcoming AI-driven PMax features
Google’s trajectory with PMax is clear: more automation, more AI-generated assets, and more cross-channel optimization with less manual control. Asset automation settings already enable auto-generated text, videos, and headlines [1], and Google recommends enabling them for maximum performance [11]. The practical question is whether auto-generated assets match your brand standards and messaging requirements.
In my testing, auto-generated text assets tend to be serviceable but generic. They pull from your landing page content and produce headlines and descriptions that are grammatically correct and topically relevant but lack the specificity or voice that distinguishes strong creative from adequate creative. For brands with strict messaging guidelines, this is a problem. For performance-focused advertisers who care more about coverage than brand consistency, it’s a reasonable tradeoff that fills gaps in asset groups that would otherwise have incomplete coverage.
Auto-generated video is a bigger gamble. Google can assemble video assets from your images and text, but the results often look like what they are: automated slideshows. If YouTube is a meaningful channel for your PMax campaigns, investing in purpose-built video assets will almost certainly outperform auto-generated ones. The gap between “has a video” and “has a good video” is significant in terms of engagement and conversion rates on YouTube placements.
What I’d watch for in the second half of 2026 is deeper integration between PMax and Google’s generative AI capabilities. Google has been steadily expanding Gemini’s role across its ad products, and PMax is a natural home for AI-generated creative at scale. If Google begins offering real-time creative generation that adapts messaging to individual user contexts (rather than just assembling pre-built assets), the value of your inputs (audience data, brand guidelines, product feed quality) will increase, not decrease. Automation doesn’t reduce the importance of inputs; it amplifies the consequences of getting them right or wrong.
Advertisers who build clean, well-structured PMax campaigns now, with tight asset group themes, strong first-party audience signals, and rigorous measurement practices, will be better positioned to adopt whatever Google ships next. Advertisers who treat PMax as a dump-and-pray channel will find that the algorithm’s expanding capabilities simply magnify their existing structural problems at greater scale and speed.
Sources
- Performance Max Optimizations | Google Ads API
- Performance Max Setup: The Ideal Guide (From a PMax Skeptic)
- Google Ads Performance Max Optimization: 20 Controls That…
- Performance Max Creative Specs: Sizes, Formats, and Best Practices
- Performance Max Assets: Types, Best Practices and Requirements
- Audience signals in Performance Max are doing less than you think
- 6 Ways to Optimize Performance Max Campaigns (2026)
- Set up Performance Max campaign: Ultimate guide in 2026
- About Performance Max campaigns
- Google Ads PMax Optimization with Claude AI 2026

