If return visitors seem to get different treatment online, the first honest answer is that a repeat visit does not look like a fresh stranger arriving. A site may now see the same browser, the same cart, the same referral context, or the same high-intent shopper coming back. That does not automatically prove personalized pricing, but it is exactly why a changed price or changed experience feels hard to trust.

That distrust now has a real regulatory backdrop. In 2024 the FTC opened an inquiry into what it calls surveillance pricing, warning that companies can use location, demographics, browsing history, shopping history, and other personal data to set individually targeted prices. The agency then ordered eight companies to turn over information on products and services in that category. So when users wonder whether a return visit changed how the system saw them, they are no longer reacting to a fringe internet rumor. They are reacting to a live consumer-protection question.

The repeat-visit suspicion gets stronger because modern websites are built to preserve continuity. Princeton’s web-measurement work shows how ordinary the tracking stack has become, and EFF’s Cover Your Tracks shows how distinctive a browser can still look even after the user thinks they took a privacy step. If a site can recognize the same browser, reconnect the same cart, or carry forward the same campaign context, then a return visit is never really anonymous. It looks like a known shopper resuming a high-intent session.
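To make the continuity point concrete, here is a minimal sketch of trait-based recognition. The traits and the `fingerprint` helper are hypothetical and far simpler than a real fingerprinting stack (which draws on canvas rendering, fonts, audio, and dozens of other signals), but the mechanism is the same: stable, passively observable traits hash to the same identifier on every visit, so a "fresh" visit with cleared cookies can still be reconnected.

```python
import hashlib

def fingerprint(traits: dict) -> str:
    """Hash a set of browser traits into a short stable identifier.

    Illustrative only: real fingerprinting uses many more signals
    and more robust canonicalization than this sketch.
    """
    canonical = "|".join(f"{k}={traits[k]}" for k in sorted(traits))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# The same browser presents the same traits on both visits, so the
# two "sessions" collapse into one identity without any cookie at all.
traits = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "screen": "2560x1440",
    "timezone": "America/New_York",
    "language": "en-US",
}
visit_1 = fingerprint(traits)
visit_2 = fingerprint(traits)  # e.g. an incognito window a day later
assert visit_1 == visit_2
```

This is also why Cover Your Tracks reports a browser's distinctiveness in bits of entropy: each trait alone is common, but the combination is often close to unique.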

That does not mean every return-visit price jump is personalized pricing. It means the conditions for the suspicion are real. A system may know that you compared, hesitated, came back, or narrowed toward purchase. Those signals can matter even when the site never exposes them in plain language. The Wall Street Journal’s old reporting on Orbitz, which showed Mac users pricier hotels first in search results, is still useful here as a well-documented boundary case: device and profile clues can influence what gets shown, even when the user cannot inspect the ranking logic from the outside.

The practical privacy problem is not just the price on the screen. It is the asymmetry. The shopper is asked to trust that the difference was innocent, while the page keeps hidden memory of the session through cookies, browser traits, referral tags, cart state, or account continuity. That asymmetry is why people open incognito tabs, switch devices, clear cookies, or ask a friend to check the same item: they are trying to break the continuity long enough to see whether the page behaves differently.
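The trouble with those do-it-yourself experiments is that each one severs only one thread of continuity. A minimal sketch, with a hypothetical `recognition_signals` checklist standing in for whatever a real page keeps server-side, shows why clearing cookies alone rarely makes a visitor anonymous:

```python
def recognition_signals(session: dict) -> list:
    """Hypothetical checklist of continuity threads a page might hold.

    Each entry is one independent way to reconnect a return visitor;
    the keys here are illustrative, not a real site's schema.
    """
    signals = {
        "cookie_id":    session.get("cookie_id") is not None,
        "fingerprint":  session.get("fingerprint_match", False),
        "referral_tag": session.get("utm_campaign") is not None,
        "cart_state":   bool(session.get("cart_items")),
        "account":      session.get("logged_in", False),
    }
    return [name for name, present in signals.items() if present]

# After the shopper clears cookies: the cookie and cart are gone,
# but the browser still matches its fingerprint, and the return link
# still carries the retargeting campaign tag.
after_cookie_clear = {
    "fingerprint_match": True,
    "cart_items": [],
    "utm_campaign": "retarget-2024",
    "logged_in": False,
}
print(recognition_signals(after_cookie_clear))
# → ['fingerprint', 'referral_tag']
```

Breaking every thread at once (new device, new network, direct navigation, no account) is exactly the experiment most shoppers cannot easily run, which is why the asymmetry persists.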

Cloak’s angle should stay grounded here. The product should not promise to prove the secret reason behind every changed price. It should reduce cheap session continuity, strip some of the tracking clues that make return visits easy to connect, and warn when a shopping flow starts combining recognition with pressure. That is a more honest answer than pretending every suspicious jump is solved or every pricing system is fully visible from the outside.
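The "recognition plus pressure" warning described above can be sketched as a simple heuristic. Everything here is hypothetical: the `PRESSURE_CUES` list, the two-signal threshold, and the `should_warn` name are illustrative choices, not Cloak's actual detection logic.

```python
# Illustrative urgency phrases; a real detector would need a far
# richer lexicon and page-structure analysis.
PRESSURE_CUES = ("only", "left in stock", "price may rise", "ends soon")

def should_warn(page_text: str, continuity_signals: list) -> bool:
    """Hypothetical heuristic: warn when a page both recognizes the
    visitor (two or more continuity signals, e.g. cookie + fingerprint)
    and applies urgency pressure in its copy.
    """
    pressured = any(cue in page_text.lower() for cue in PRESSURE_CUES)
    recognized = len(continuity_signals) >= 2
    return pressured and recognized

# Recognition alone or pressure alone does not trigger the warning;
# the combination does.
assert should_warn("Only 2 left in stock!", ["cookie_id", "fingerprint"])
assert not should_warn("Only 2 left in stock!", ["cookie_id"])
assert not should_warn("Welcome back.", ["cookie_id", "fingerprint"])
```

Requiring both conditions keeps the sketch honest to the framing in the text: recognition by itself is ordinary web plumbing, and urgency copy by itself is ordinary marketing; the warning is for the combination.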