Who’s Selling Your Secrets?
In early 2014, Mike Seay received junk mail from OfficeMax that left him stunned. The envelope included the address line: "Mike Seay, Daughter Killed in Car Crash." The detail wasn't just specific; it was painfully true. His 17-year-old daughter, Ashley, had died in a tragic car accident the previous year, and Seay had never disclosed this information to OfficeMax. In fact, he had no idea how the company or anyone associated with it could have learned such a deeply personal fact.
OfficeMax later explained that the mailing list had been rented from a third-party provider and issued a public apology. But that did little to reassure Mike. When he first contacted customer service, they were skeptical of his claim until he sent photographic proof of the letter. On the surface, it looked like just another marketing coupon. But the incident exposed something far more disturbing: our most personal details are being quietly harvested, traded, and sold without our knowledge or consent.
Mike Seay’s experience offers a glimpse into the hidden world of data brokers: companies specializing in collecting vast amounts of personal information from countless sources, packaging it into detailed profiles, and selling these dossiers to businesses. Even if you've never interacted directly with data brokers, it's almost certain they have already created a profile about you.
Understanding who these entities are, why they exist, and how their practices affect your life is the first step toward reclaiming control over your personal information.
What Are Data Brokers?
If data truly is the new oil, then data brokers are its most mysterious refineries: shadowy operations quietly pumping personal details from the deep wells of our digital lives, processing them behind closed doors, and selling the refined product (your private information) to anyone willing to pay.
Data brokers aren't household names. You’ve probably never heard of Acxiom, Experian, or CoreLogic, yet they know more about you than you might feel comfortable imagining. These companies lurk behind the digital curtain, never interacting directly with you, but absorbing your data nonetheless. Their business revolves around quietly gathering, packaging, and trading massive amounts of personal information about millions of people.
So how exactly do these data brokers get their hands on your personal details?
They begin simply enough, by harvesting publicly available information—voter registration lists, court records, property documents, and vehicle registrations. From there, the journey becomes increasingly murky. Every loyalty card scanned at the supermarket, every survey you've completed online, every warranty you've activated on a new appliance becomes another small puzzle piece in a much larger picture. Brokers purchase these snippets of personal data from companies eager to monetize customer information.
But where the story really takes a turn for the unsettling is online.
The internet has become a digital minefield, peppered with invisible tracking technologies quietly working to capture your every click, hover, and scroll. Cookies—those tiny digital markers stored in your browser—let trackers recognize you from visit to visit, recording not just which sites you've visited, but how long you stayed, what you clicked, and even items you almost bought but decided against at the last moment.
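To make the pattern concrete, here is a minimal sketch in Python (using Flask) of how a cookie-based tracker generally works: assign a long-lived identifier the first time a browser shows up, then log every page that embeds the tracker. The endpoint name, cookie name, and logging are assumptions for illustration, not any ad network's actual code.

```python
# Minimal sketch of cookie-based tracking; illustrative only, not a real vendor's code.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
visit_log = {}  # visitor_id -> pages viewed (a real tracker would use a database)

@app.route("/track")
def track():
    # Reuse the tracking ID already stored in this browser, or mint one on first contact.
    visitor_id = request.cookies.get("visitor_id") or str(uuid.uuid4())
    page = request.args.get("page", "unknown")

    # Every site that embeds this tracker adds another entry to the visitor's history.
    visit_log.setdefault(visitor_id, []).append(page)

    resp = make_response("", 204)
    # A long-lived cookie lets the tracker recognize the same browser for years.
    resp.set_cookie("visitor_id", visitor_id, max_age=60 * 60 * 24 * 365 * 2)
    return resp
```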
Then there are pixel trackers, nearly invisible images embedded in web pages or email newsletters. They quietly alert data brokers whenever you open an email, click a link, or load a website. And more troubling still is fingerprinting: an advanced form of tracking that identifies your specific device by its unique combination of software settings, fonts, and screen resolution. Even if you delete cookies or use incognito browsing, fingerprinting technology can still spot you in a digital crowd.
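The sketch below, again in Python with Flask purely for illustration, shows both ideas in miniature: an endpoint that serves a tracking pixel and derives a crude fingerprint from request headers. Real fingerprinting also runs inside the browser (canvas, fonts, screen size); the route name, headers used, and logging here are assumptions, not any vendor's implementation.

```python
# Illustrative sketch of a tracking pixel plus a crude header-based fingerprint.
import base64
import hashlib
from flask import Flask, request, make_response

app = Flask(__name__)

# A 1x1 transparent GIF, the classic tracking-pixel payload.
PIXEL = base64.b64decode("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

@app.route("/pixel.gif")
def pixel():
    # Even without cookies, this combination of headers narrows down who you are.
    raw = "|".join([
        request.headers.get("User-Agent", ""),
        request.headers.get("Accept-Language", ""),
        request.headers.get("Accept-Encoding", ""),
        request.remote_addr or "",
    ])
    fingerprint = hashlib.sha256(raw.encode()).hexdigest()[:16]

    # Loading this "image" tells the tracker the email was opened or the page was viewed.
    print(f"pixel hit: fingerprint={fingerprint} campaign={request.args.get('c')}")

    resp = make_response(PIXEL)
    resp.headers["Content-Type"] = "image/gif"
    return resp
```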
The scale at which these data brokers operate is dizzying.
For instance, Acxiom alone has boasted about processing upwards of 50 trillion transactions in a single year. They aggregate these endless streams of data into detailed consumer profiles, sometimes containing thousands of individual data points per person—everything from your income level, marital status, and shopping habits to inferred characteristics about your health status, religious beliefs, sexual orientation or political leanings.
These profiles aren't just comprehensive; they're disturbingly precise.
Data brokers package people into neat demographic boxes with labels like "Rural Everlasting," "Urban Scramble," or "Diabetes Interest." These categories are not merely descriptive; they are monetized. Companies, political campaigns, insurance providers, and banks all pay handsomely to target these highly specific groups with tailored advertising, policy offers, or even political messaging.
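To see how labels like these can emerge, here is a toy Python sketch that clusters invented consumer profiles on a few attributes; in practice an analyst would then name each cluster. Every feature, number, and segment is an assumption for illustration, not any broker's actual methodology.

```python
# Toy segmentation: cluster synthetic consumer profiles, then summarize each cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# Hypothetical columns: age, household income ($k), urban density score, purchases/month
profiles = np.column_stack([
    rng.integers(18, 85, 500),
    rng.normal(60, 25, 500),
    rng.random(500),
    rng.poisson(3, 500),
])

segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)
for s in range(4):
    members = profiles[segments == s]
    print(f"segment {s}: {len(members)} people, "
          f"median age {np.median(members[:, 0]):.0f}, "
          f"median income ${np.median(members[:, 1]):.0f}k")
```

A human then attaches a marketable name to each numbered segment, and that label follows you from list to list.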
Perhaps most troubling is how powerless individuals are when it comes to controlling their own data. Although data brokers technically offer opt-out mechanisms, the processes can feel like deliberately designed labyrinths. Instructions are often vague, requests can take weeks to be acknowledged, and, even then, your data might persist or swiftly reappear under a slightly altered name or address. It's a game of whack-a-mole played at your expense, and rarely can you win outright.
This asymmetry isn’t accidental. It’s fortified by opaque business practices and aggressive lobbying that keep consumer privacy laws weak, fragmented, and full of loopholes. These companies quietly profit from an imbalance of knowledge: they know almost everything about you, while you know almost nothing about them.
Data brokers embody the most troubling side of our digital age—a hidden industry thriving in obscurity, commodifying our personal lives with little oversight and even less consent. But just as troubling as what they already do is how new technology threatens to make it even worse.
AI Makes It Worse
Data brokers have long inferred sensitive attributes about people, from health status and political affiliations to financial stability. But AI supercharges these inferences by processing astonishing volumes of data. Where humans see endless rows of data and spreadsheets, AI sees patterns, relationships, and possibilities. Algorithms rapidly uncover hidden correlations invisible to humans, allowing data brokers to actively predict and influence your next moves.
Recall Acxiom and its reported 50 trillion data transactions a year. Imagine what an AI trained on datasets of that scale might reveal about individuals: not just demographic details like age or address, but deeply intimate traits such as emotional states, stress levels, or mental health indicators.
AI can weave together disparate threads from seemingly innocuous activities. Your recent online searches, browsing patterns, spending habits, and even subtle shifts in your language on social media can be combined to infer your mood or predict major life events, such as marriage, pregnancy, or illness. The more these models learn, the more precisely they can anticipate your vulnerabilities, desires, and motivations, enabling personalized marketing campaigns that feel uncannily tailored. On the surface, this might sound convenient, perhaps even beneficial. But beneath the personalized recommendations lurks a troubling risk: exploiting emotional or financial vulnerabilities at precisely the moment you're most susceptible.
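As a toy illustration of the mechanics, and not of any broker's actual models, the following Python sketch trains a simple classifier on entirely synthetic behavioural signals and then scores a new individual. Every feature, label, and number is invented.

```python
# Toy inference model on made-up behavioural data; shows the mechanics only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per person:
# [topic-specific searches/week, late-night browsing hours, spending change, sentiment shift]
X = rng.normal(size=(1000, 4))
# Synthetic label standing in for "major life event within six months".
y = (1.5 * X[:, 0] + X[:, 2] > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

# Score one new, never-consulted individual from their tracked behaviour.
new_person = np.array([[2.1, 0.4, 1.8, -0.2]])
probability = model.predict_proba(new_person)[0, 1]
print(f"Predicted probability of a major life event: {probability:.0%}")
```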
For instance, an insurance company could use AI-powered risk assessments that scrutinize your social media activities and location history, subtly increasing your premium because the algorithm predicts you're more accident-prone or emotionally unstable. Similarly, banks might deploy algorithms trained on AI-derived consumer profiles to automatically reject loan applications without transparency or human review, locking out entire segments of society based on questionable correlations found deep within massive datasets.
The issue of bias and fairness becomes especially pronounced with AI. Algorithms learn from historic data, absorbing and amplifying existing prejudices. A model trained on historical loan approvals might perpetuate systemic discrimination against certain communities, effectively automating injustice at scale. Worse still, the decision-making logic of these sophisticated AI systems is typically locked behind proprietary secrecy—so-called "black boxes." When your application for credit or insurance is denied, companies often refuse to disclose precisely why, claiming that revealing their algorithms would compromise trade secrets.
This secrecy leaves consumers helpless, unable to challenge unfair or inaccurate assessments that shape their lives.
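To make the bias problem concrete, here is a synthetic Python sketch of proxy discrimination: the protected attribute is never shown to the model, yet a correlated field lets historically biased approvals reproduce themselves in new decisions. All variables, coefficients, and thresholds are fabricated for demonstration.

```python
# Synthetic demonstration of bias inheritance through a proxy variable.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

group = rng.integers(0, 2, n)                          # protected attribute (never given to the model)
neighbourhood = 0.8 * group + rng.normal(0, 0.3, n)    # proxy correlated with group
income = rng.normal(50 + 5 * (1 - group), 10, n)       # historical inequity in income

# Historical approvals were biased against group 1 beyond what income explains.
approved = (income + rng.normal(0, 5, n) - 8 * group > 48).astype(int)

X = np.column_stack([income, neighbourhood])           # note: 'group' is excluded
model = LogisticRegression(max_iter=1000).fit(X, approved)

predictions = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted approval rate {predictions[group == g].mean():.0%}")
```

Even with the sensitive attribute removed, the model's approval rates diverge, because the proxy carries the old bias forward.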
Real Lives, Real Harm
For some, the effects of data brokers’ hidden influence can be deeply unsettling and even life-changing. Mike Seay’s experience with OfficeMax is not the only cautionary tale.
Thomas Robins, a young Virginian, discovered while looking for a job that he had been profiled inaccurately by a people-search website called Spokeo. The site listed him as decades older than he was and incorrectly claimed he was married with children. If a prospective employer happened to see those discrepancies, it could easily raise questions about his honesty and cost him the opportunity. Robins sued Spokeo under the Fair Credit Reporting Act, arguing that the company’s data should be treated like a credit report, with obligations to maintain accuracy and to inform individuals about how their information is being used. Though the case did not ultimately redefine industry practices, it spotlighted the problem of incorrect data harming innocent people.
Another example comes from journalist Gillian Brockell, who chronicled her pregnancy on social media. Algorithms swiftly tagged her as an expectant mother, and ads for baby products soon flooded her feeds. When she tragically lost her baby, those ads persisted, reminding her over and over of her loss. Although she posted publicly about what had happened, the data-driven advertising engines kept serving her the same cheerful promotions, underscoring how automated profiling can completely ignore critical human context.
Both stories show that our personal data isn’t merely collected; it can be acted upon, sometimes in ways that feel careless or hurtful. Whether the result is denial of a job interview or an emotionally distressing ad, data-driven systems carry genuine weight in people’s personal and professional lives. Understanding how these systems function helps illuminate why they need oversight and why individuals should take steps to safeguard their own data.
What You Can Do
It’s easy to feel overwhelmed by the scale and complexity of digital surveillance. The systems harvesting, analyzing, and monetizing your personal data operate quietly, often invisibly, in the background of your daily life. But that doesn’t mean you’re powerless. There are meaningful steps individuals and organizations can take to push back against the imbalance and begin to reclaim control.
For individuals, the first step is awareness. Much of what fuels data collection is voluntary disclosure. We routinely give away intimate details, like birthdates, health conditions, and political views, without realizing their value or risk. Limiting what you share online, especially on social media, can make a significant difference. Be cautious with forms that ask for sensitive information like your Social Insurance Number (SIN), and resist the urge to trade personal details for small conveniences or discounts.
Beyond what you share, how you browse also matters. Privacy-focused tools can help reduce your digital footprint. Consider using browsers like Safari, DuckDuckGo, or Brave with strong privacy settings enabled. Install trusted browser extensions that block trackers and prevent fingerprinting. Using a reputable Virtual Private Network (VPN) can help mask your location and browsing activity—especially when connected to public Wi-Fi. If you use Apple products, consider turning on Private Relay, which helps shield your IP address from prying eyes. Switching to privacy-conscious search engines like DuckDuckGo limits the amount of personal information tied to your queries.
And finally, ask questions—especially when AI enters the picture. Consent in the digital world is often assumed, buried in terms and conditions, or asked in passing. But when you’re faced with a question like “Are you okay if we use AI to transcribe our conversation?” it’s okay to say no. Don’t be afraid to pause and ask: Where does this data go? Who sees it? Can I opt out? By pushing back, even gently, you assert your rights and remind organizations that consent must be informed, not assumed.
These changes may seem small, but collectively they help you regain some agency in an environment designed to profile you by default.
Still, even the most careful user can’t fully escape data extraction. That’s why individual action, while important, isn’t enough. Real change requires shifts in the systems and incentives that drive data collection, starting with the organizations that benefit most from it.
Marketers, in particular, sit at a critical crossroads.
You’re not just collecting data—you’re shaping the digital experiences that define how people move through the online world. The data you gather holds immense potential: not just to personalize, but to influence, persuade, and shape behaviour.
With that power comes responsibility.
Relying solely on compliance with current laws is no longer sufficient. Instead, marketers must adopt a more ethical, intentional approach to data stewardship.
Begin by auditing your data collection practices. Ask yourself: are we gathering information simply because we can, or because it serves a clear and justified business purpose? Only collect what’s necessary. Avoid hoarding data “just in case”—doing so not only increases risk but erodes customer trust.
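One lightweight way to start is sketched below: compare the fields your forms actually collect against a documented purpose for each, and treat anything without a purpose as a candidate to stop collecting. The field names and purposes are hypothetical; a real audit would pull them from your actual schemas and records of processing.

```python
# Minimal data-minimization check against a (hypothetical) purpose register.
collected_fields = ["email", "postal_code", "birthdate", "income_bracket", "device_id"]

documented_purposes = {
    "email": "order confirmation and receipts",
    "postal_code": "shipping",
    # No entry means nobody has justified why the field is collected.
}

def audit(fields, purposes):
    """Return the fields that have no documented business purpose."""
    return [field for field in fields if field not in purposes]

unjustified = audit(collected_fields, documented_purposes)
print("Collected without a documented purpose:", unjustified)
```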
Next, take a hard look at data storage and access. Know exactly where your data is hosted, and who can see it. This is particularly critical when data crosses borders. Canadian businesses that store data in the U.S., for example, expose their customers to foreign surveillance laws. Limit access internally to only those who need it, and implement regular audits to ensure access controls remain tight and up to date.
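A periodic access review can be as simple as the hypothetical sketch below: list every account that can read customer data, compare each against the roles that genuinely need it, and flag the rest for revocation or re-justification. Account names and roles are invented; a real review would pull grants from your database or identity provider.

```python
# Hypothetical access review: flag grants that fall outside need-to-know roles.
access_grants = {
    "analytics_ro": "marketing analyst",
    "support_tool": "customer support",
    "legacy_export": "former vendor integration",
    "ceo_dashboard": "executive reporting",
}

roles_with_need = {"marketing analyst", "customer support"}

def review_access(grants, needed_roles):
    """Return the grants whose role is not on the need-to-know list."""
    return {account: role for account, role in grants.items() if role not in needed_roles}

for account, role in review_access(access_grants, roles_with_need).items():
    print(f"revoke or re-justify: {account} ({role})")
```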
Transparency should be at the heart of your data strategy. Communicate clearly with your customers: What are you collecting? Why? How will it be used? When customers understand your intentions—and see their data being used to improve their experience—they’re far more likely to engage and trust your brand.
Responsible data management isn’t just about avoiding fines; it’s about protecting people. And in doing so, you protect your brand, your business, and the integrity of the digital economy itself.
The Takeaway
Behind every data point is a human being. Your personal information (your health, finances, habits, beliefs) is not just bits and bytes. It’s a detailed portrait of your life. And in the hands of data brokers, it becomes a product that is collected, analyzed, and sold without your knowledge.
These aren’t passive records sitting on a server somewhere. They’re active profiles that influence everything from the ads you see to the prices you pay, and even whether you’re approved for a loan or offered a job.
The impact is real, and often invisible.
Canadians: pay attention. Much of our data is managed by U.S.-based brokers, meaning it’s governed by American laws, not Canadian ones. Our personal information routinely crosses borders, where Canadian privacy standards no longer apply. This isn't just a privacy issue; it's a question of national autonomy.
When foreign corporations control the data of millions of Canadians, it undermines our economic independence and increases the risk of misuse or exploitation. Canadian consumers and businesses alike are left vulnerable, and our ability to enforce privacy standards is weakened.
Canada’s proposed Consumer Privacy Protection Act (CPPA), part of Bill C-27, is designed to address these gaps. It sets out clearer consent requirements, rules for how personal information is collected, used, and disclosed, and stronger enforcement powers, steps toward reclaiming control over one of the country’s most valuable assets.
As the 2025 federal election approaches, privacy and data sovereignty must be part of the conversation. Supporting candidates and policies that strengthen these protections is essential, not only for individual rights but for the long-term security and independence of the country.
Data is power. Right now, that power is unevenly distributed.
It's time we changed that.
Sources
Brill, Julie. “A Call for Transparency and Accountability.” Federal Trade Commission, 27 May 2014, www.ftc.gov/public-statements/2014/05/call-transparency-accountability. Accessed 25 Mar. 2025.
Brockell, Gillian. “Dear Tech Companies: I Don’t Want to See Pregnancy Ads After My Child Was Stillborn.” The Washington Post, 11 Dec. 2018, www.washingtonpost.com/opinions/2018/12/11/dear-tech-companies-i-dont-want-see-pregnancy-ads-after-my-child-was-stillborn. Accessed 25 Mar. 2025.
Kanwal, Rahul, and Kevin Walby. Tracking the Surveillance and Information Practices of Data Brokers: A Report. University of Winnipeg, www.uwinnipeg.ca/caij. Accessed 25 Mar. 2025.
Pearce, Matt. “OfficeMax Apologizes for Letter Addressed to ‘Daughter Killed in Car Crash’.” Los Angeles Times, 19 Jan. 2014, www.latimes.com/nation/nationnow/la-na-nn-officemax-mail-20140119-story.html. Accessed 25 Mar. 2025.
Privacy Commissioner of Canada. “Data Brokers and PIPEDA: An Overview of Privacy Concerns.” Office of the Privacy Commissioner of Canada, 2019, www.priv.gc.ca. Accessed 25 Mar. 2025.
“Spokeo, Inc. v. Robins.” Wikipedia: The Free Encyclopedia, Wikimedia Foundation, en.wikipedia.org/wiki/Spokeo%2C_Inc._v._Robins. Accessed 25 Mar. 2025.
“The Data Brokers: Selling Your Personal Information.” CBS News, www.cbsnews.com/news/the-data-brokers-selling-your-personal-information/. Accessed 25 Mar. 2025.
“Will Canada Pass Bill C-27?” International Association of Privacy Professionals (IAPP), iapp.org/news/a/ahead-of-2025-federal-election-will-canada-pass-bill-c-27-/. Accessed 25 Mar. 2025.