Websites can show different shoppers different prices because pricing systems use what they collect or infer about each person to shape not just the sticker price, but the offer, ranking, urgency cue, discount, or treatment that person sees. Data-driven pricing does not always appear as one clean line saying your price changed. More often it shows up as a tighter offer, a worse ranking, a pushier flow, or a deal calibrated to what the system thinks you will tolerate.

That is why consumer privacy and pricing cannot be treated as separate topics. The FTC’s 2024 surveillance-pricing inquiry noted that the products under review can use signals such as location, browsing history, shopping history, demographics, and credit information to influence what someone is shown or charged. Even when the downstream effect is not visible as a single obvious price difference, the system is still using personal data to shape economic treatment.
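To make the claim concrete, here is a toy sketch of how a system could map inferred shopper signals to the "shape" of an offer rather than just a number. Everything here is hypothetical: the field names, thresholds, and treatments are invented for illustration and do not describe any specific vendor's model.

```python
# Hypothetical illustration only: a toy "treatment" function showing how
# inferred signals could shape the offer a person sees, not just the price.
# All signal names, weights, and tiers are invented.

from dataclasses import dataclass


@dataclass
class ShopperSignals:
    inferred_income_band: str   # e.g. "low", "mid", "high" (inferred, not declared)
    urgency_score: float        # 0..1, e.g. from repeat visits to one product page
    price_sensitivity: float    # 0..1, e.g. from past discount-hunting behavior


def offer_treatment(s: ShopperSignals) -> dict:
    """Return the offer 'shape' a shopper is shown: discount, pressure, ranking."""
    discount = 0.15 if s.price_sensitivity > 0.7 else 0.0
    show_countdown = s.urgency_score > 0.5          # pushier flow for "urgent" users
    rank_boost = s.inferred_income_band == "high"   # premium items ranked higher
    return {
        "discount": discount,
        "show_countdown_timer": show_countdown,
        "boost_premium_ranking": rank_boost,
    }


print(offer_treatment(ShopperSignals("low", 0.8, 0.9)))
```

The point of the sketch is that none of these outputs is a visible price change, yet each one alters the shopper's economic treatment.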

The infrastructure for that kind of sorting is already large. ICCL’s work on real-time bidding found that data about the average person is broadcast 747 times per day in the United States and 376 times per day in Europe. The UK ICO said a single bid request can be distributed to hundreds of organizations. Once enough intermediaries can see behavioral context, consumer data stops being a passive analytics layer and starts looking like raw material for differential treatment.
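To see what a single broadcast can carry, here is a minimal sketch of a bid request loosely modeled on OpenRTB 2.x fields. The values are invented and real requests vary by exchange, but the categories of data (page context, location, device identifiers, inferred user segments) match what the RTB research describes.

```python
import json

# Illustrative payload loosely modeled on OpenRTB 2.x field names.
# All values are invented; this is not a capture of a real request.
bid_request = {
    "id": "req-123",
    "site": {"page": "https://example.com/loans/bad-credit"},  # what the user is reading
    "device": {
        "ua": "Mozilla/5.0 (X11; Linux x86_64)",   # fingerprinting input
        "geo": {"lat": 53.35, "lon": -6.26},       # location
        "ifa": "aaaa-bbbb-cccc",                   # advertising identifier
    },
    "user": {
        "id": "u-789",
        "data": [{"segment": [{"id": "debt-stressed"}]}],  # inferred audience segment
    },
}

print(json.dumps(bid_request, indent=2))
```

Per the ICO's finding, a payload like this can fan out to hundreds of organizations per request, which is why the behavioral context stops being private the moment the auction runs.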

The phrase “digital redlining” matters because it names the power imbalance. A person can be sorted into a more extractive experience without ever seeing the model, the score, or the rule that shaped it. They only experience the output: the suspiciously timed prompt, the offer that feels tighter than it should, the ranking that nudges them toward a worse choice, or the sense that the system is reading their constraints better than they can inspect its behavior.

Cloak cannot solve every form of digital inequality on its own. But it can push in the right direction. A privacy defense layer that reduces collection, weakens fingerprinting, and makes hidden pressure legible is also a defense against the consumer-facing edge of digital redlining. If people cannot see how they are being read, they cannot defend themselves.
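The "reduces collection, weakens fingerprinting" idea can be sketched as a simple client-side filter that drops or flattens high-entropy request headers before they leave the browser. The header names below are real HTTP headers, but the policy and the function are hypothetical illustrations of the principle, not Cloak's actual implementation.

```python
# Hypothetical sketch of surface reduction: drop identifying headers and
# flatten fingerprintable ones before a request is sent. Not Cloak's code.

HIGH_ENTROPY = {"cookie", "referer", "x-forwarded-for"}  # identifiers / cross-site context
GENERIC_UA = "Mozilla/5.0 (generic)"


def reduce_surface(headers: dict) -> dict:
    """Return a copy of headers with tracking-relevant fields removed or flattened."""
    out = {}
    for name, value in headers.items():
        key = name.lower()
        if key in HIGH_ENTROPY:
            continue                  # drop: these carry identity or browsing context
        if key == "user-agent":
            out[name] = GENERIC_UA    # flatten: UA strings are a fingerprinting input
        else:
            out[name] = value
    return out
```

Even a crude filter like this shrinks the signal set that downstream systems can use to sort a person into a pricing segment, which is the consumer-facing edge this section is about.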