The previous article was about states. This one is about corporations. The argument structure is parallel: corporations have always shaped consumer behavior, and AI gives them dramatically more capable tools for doing so. What changes is the granularity, the speed, and the asymmetry between producer and target.

The basic mechanism

Most contemporary corporate AI use, from a behavior-shaping standpoint, sits in three categories.

Recommendation. AI selects what each user sees on Amazon, Netflix, Spotify, YouTube, TikTok. The selection is optimized for a corporate metric — usually some version of engagement or transaction — and the optimization is per-user, in real time. The cumulative effect is a shaped consumption profile: each user encounters a stream of options calibrated to keep them transacting. This is not new in kind (catalogs and store layouts have always attempted something similar) but it is new in scale and precision.
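The optimization loop behind recommendation is, at its core, simple. A minimal sketch in Python of per-user engagement optimization as an epsilon-greedy bandit; the class, item names, and click-rate model are invented for illustration and are not taken from any real platform:

```python
import random

class EngagementOptimizer:
    """Hypothetical per-user recommender: learn which kind of item
    keeps this particular user clicking, and show more of it."""

    def __init__(self, items, epsilon=0.1):
        self.items = list(items)
        self.epsilon = epsilon                    # fraction of impressions spent exploring
        self.shows = {i: 0 for i in self.items}   # impressions per item
        self.clicks = {i: 0 for i in self.items}  # observed clicks per item

    def pick(self):
        # Occasionally explore a random item; otherwise exploit the
        # item with the best observed click-through rate so far.
        if random.random() < self.epsilon:
            return random.choice(self.items)
        return max(self.items,
                   key=lambda i: self.clicks[i] / self.shows[i]
                                 if self.shows[i] else 0.0)

    def record(self, item, clicked):
        # Feedback step: every impression updates this user's profile.
        self.shows[item] += 1
        self.clicks[item] += int(clicked)
```

The loop — show, observe, update, repeat — is what "optimized per-user, in real time" means mechanically. Production systems replace the click-rate table with learned models over thousands of features, but the shape of the loop is the same.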

Personalization of price and offer. AI determines, per user, what price to display, what offer to make, what bundle to suggest. Dynamic pricing, on its public face, adjusts for time of day, season, demand; on its private face, it can adjust for the user’s payment history, device type, location, and inferred willingness to pay. The Susser et al. example of an e-commerce site raising prices when the buyer’s phone battery is low is illustrative; less dramatic versions are pervasive.
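What "adjust for inferred willingness to pay" means can be made concrete. A minimal, hypothetical sketch in Python; the signal names, weights, and the battery-level rule (echoing the Susser et al. example) are illustrative assumptions, not the logic of any actual platform:

```python
BASE_PRICE = 40.0  # hypothetical list price

def personalized_price(user_signals: dict) -> float:
    """Adjust a base price using inferred willingness-to-pay signals.
    All signals and multipliers below are invented for illustration."""
    price = BASE_PRICE
    # Public-face adjustment: demand at this moment.
    price *= 1.0 + 0.10 * user_signals.get("demand_index", 0.0)
    # Private-face adjustments: per-user inference.
    if user_signals.get("device") == "premium_phone":
        price *= 1.05   # device type as a wealth proxy
    if user_signals.get("battery_level", 1.0) < 0.15:
        price *= 1.08   # urgency proxy (the Susser et al. case)
    if user_signals.get("repeat_buyer"):
        price *= 1.03   # demonstrated willingness to pay
    return round(price, 2)

print(personalized_price({"demand_index": 0.5, "device": "premium_phone",
                          "battery_level": 0.10, "repeat_buyer": True}))
# prints 49.06 — versus 40.00 for a user with no such signals
```

The asymmetry the article describes is visible in the sketch: the function sees every signal and every multiplier; the user sees only the final number.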

Persuasive interfaces. UI choices — what is visible, what is prominent, what is one click away versus three — are now optimized in real time, per user. Dark patterns — interfaces designed to steer users toward choices they would not otherwise make, through confusion, obstruction, or the exploitation of attention — become dynamic dark patterns when AI tunes them to the individual. The user’s experience is personalized in directions that benefit the platform.

These three categories are not exhaustive but cover most of the empirical ground.

A specific case: Alexa

The encyclopedia’s running example is voice-assistant advertising. Reports on Amazon’s Alexa documented that voice queries — what users verbally ask their assistant — feed advertising profiles that are then shared with up to forty advertising partners.1 A user who asks Alexa about a product receives related advertisements, on other platforms, in the following days. The mechanism is invisible from the user’s side: they spoke aloud in their kitchen, and days later, somewhere else, they saw an ad and did not connect the two.

The case matters as an existence proof. The corporate AI infrastructure already integrates voice, browsing, location, and purchase data into unified profiles. The targeting is fine-grained. The user-side visibility is essentially zero.

Why corporate AI is structurally different from state AI

A reasonable question, given the parallels with F.35, is why the two warrant separate treatment.

State AI typically aims at compliance — getting populations to behave in accordance with legal or normative frameworks. Its instruments are rewards, sanctions, and surveillance. Its visibility is uneven; some state AI use is openly discussed, some is hidden.

Corporate AI typically aims at transaction — getting individuals to buy, watch, click, share. Its instruments are recommendation, pricing, interface design. Its visibility is generally low; almost no consumer is fully aware of the AI infrastructure shaping their experience.

The two share the structural feature that the encyclopedia keeps returning to: the producer’s view of the manipulation is precise; the target’s view is, by design, absent. The asymmetry differs in flavor (compliance vs. transaction) but not in shape.

Persuasive design as a discipline

A development worth naming: persuasive design has, in the last decade, emerged as a recognized industrial discipline. Companies hire “behavioral scientists,” “growth designers,” “engagement strategists” — roles whose work is, in plain language, applied behavior-shaping. The field has academic foundations (BJ Fogg’s Persuasive Technology, 2003), industry conferences, professional associations, and a steady flow of practitioners trained in psychology and economics.

This is not, in itself, malicious. Many persuasive-design applications are benign or beneficial — encouraging exercise, reducing food waste, helping users save money. The discipline becomes a problem when it is applied at scale, under corporate metrics that may not align with user welfare, and without the safeguards that medical or financial behavior-shaping carries.

The current regulatory environment for persuasive design is uneven. Some practices (deceptive interfaces, nondisclosure of paid placement) are regulated; many (per-user pricing, attention-optimization metrics, dark patterns short of outright deception) are not. The European Digital Services Act and the AI Act move in the direction of more regulation; implementation is partial.

What can be done

Three responses recur, none alone sufficient.

User-level countermeasures. Ad blockers, tracking blockers, deliberate selection of platforms whose business models do not depend on attention maximization. These work for the small slice of users who care; they do not scale.

Regulatory transparency. Mandatory disclosure of personalized pricing, mandatory user controls over recommendation algorithms, mandatory audits of high-impact systems. The European track is making real progress; the American track lags.

Alternative business models. Subscription-based platforms whose incentives are not aligned with engagement maximization. The model works where users are willing and able to pay; it leaves a long tail of users who are not, who default back to ad-supported services with their attendant trade-offs.

The encyclopedia’s framing here, consistent with the rest of Section F: the structural asymmetry between corporations and consumers, in the AI era, is not addressable by individual vigilance. It requires political choices about what corporations are allowed to do with the cognitive infrastructure of public life. The choices are being made now, mostly by default.

The next article (F.37) takes up what happens when state and corporate uses of AI overlap.

Footnotes

  1. Alexa advertising integration reports.