{"id":16680,"date":"2026-04-05T10:07:01","date_gmt":"2026-04-05T17:07:01","guid":{"rendered":"https:\/\/jasonsblog.ddns.net\/?p=16680"},"modified":"2026-04-05T10:07:01","modified_gmt":"2026-04-05T17:07:01","slug":"employers-are-using-your-personal-data-to-figure-out-the-lowest-salary-youll-accept","status":"publish","type":"post","link":"https:\/\/jasonsblog.ddns.net\/index.php\/2026\/04\/05\/employers-are-using-your-personal-data-to-figure-out-the-lowest-salary-youll-accept\/","title":{"rendered":"Employers Are Using Your Personal Data to Figure Out the Lowest Salary You\u2019ll Accept"},"content":{"rendered":"\n<p>There is a nice reference to bossware tracking employees in this piece. It&#8217;s interesting that some of the megacorps questioned denied the behavior, but given recent scandals involving <a href=\"https:\/\/jasonsblog.ddns.net\/index.php\/2025\/01\/03\/exposing-the-paypal-honey-influencer-scam-paypal-is-stealing-commissions\/\">PayPal and Honey<\/a> and <a href=\"https:\/\/jasonsblog.ddns.net\/index.php\/2026\/04\/04\/linkedin-data-leak-is-insane\/\" target=\"_blank\" rel=\"noreferrer noopener\">Microsoft and LinkedIn<\/a>&#8230; they lie, and the morality and ethics of large corporations just aren&#8217;t there. It&#8217;s an interesting development, as they&#8217;re looking to limit the wealth they give employees at the same time they&#8217;re looking to extract as much wealth as they can from customers. 
And if you want to know just how bad they really are, check out this <a href=\"https:\/\/jasonsblog.ddns.net\/index.php\/2023\/02\/01\/are-you-ready-for-brain-transparency-and-ai-reading-your-mind\/\" target=\"_blank\" rel=\"noreferrer noopener\">presentation on brain transparency<\/a>, the next step up from bossware.<\/p>\n\n\n\n<p><a href=\"https:\/\/www.marketwatch.com\/story\/employers-are-using-your-personal-data-to-figure-out-the-lowest-salary-youll-accept-c2b968fb\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.marketwatch.com\/story\/employers-are-using-your-personal-data-to-figure-out-the-lowest-salary-youll-accept-c2b968fb<\/a><\/p>\n\n\n<div class=\"wp-block-ub-divider ub_divider ub-divider-orientation-horizontal\" id=\"ub_divider_83069ec0-ec17-4112-9400-174082891f30\"><div class=\"ub_divider_wrapper\" style=\"position: relative; margin-bottom: 2px; width: 100%; height: 2px; \" data-divider-alignment=\"center\"><div class=\"ub_divider_line\" style=\"border-top: 2px solid #ccc; margin-top: 2px; \"><\/div><\/div><\/div>\n\n\n<h5 class=\"wp-block-heading\">A growing number of employers are using surveillance wages to negotiate your next paycheck<\/h5>\n\n\n\n<p>By Genna Contino<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/images.mktw.net\/im-87186892?width=1260&amp;height=875\" alt=\"Illustration of a woman with a facial-recognition grid over her face, with two surveillance cameras and a pay stub in the background.\"\/><figcaption class=\"wp-element-caption\">Photo: MarketWatch photo illustration\/iStockphoto<\/figcaption><\/figure>\n\n\n\n<p>Algorithms are increasingly using personal data to determine the minimum pay a worker is willing to accept, consumer watchdogs say. 
<\/p>\n\n\n\n<p>You\u2019ve likely already felt the digital sting of \u201csurveillance pricing.\u201d It might look like an airline advertising a specific fare bundle because a customer\u2019s loyalty-program data suggests they\u2019re likely to buy it, or a website charging more for infant formula because an algorithm sensed the desperation of a new parent.<\/p>\n\n\n\n<p>We\u2019re living in a world where your purchase history, browsing speed and even your ZIP code increasingly dictate the cost of your life. And as companies get better at collecting and analyzing personal data, they aren\u2019t just gunning for the money coming out of your wallet \u2014 they\u2019re controlling how much goes into it, too.<\/p>\n\n\n\n<p>Experts describe \u201csurveillance wages\u201d as a system in which wages are based not on an employee\u2019s performance or seniority, but on formulas that use their personal data, often collected without employees\u2019 knowledge.&nbsp;<\/p>\n\n\n\n<p>Companies already try to get new hires to accept the lowest possible wage offer. But while that once meant sizing up a candidate\u2019s experience and credentials against the going market rate, it increasingly means feeding the candidate\u2019s personal data into an algorithm.<\/p>\n\n\n\n<p>According to Nina DiSalvo, policy director at labor advocacy group Towards Justice, some systems use signals associated with financial vulnerability \u2014 including data on whether a prospective employee has taken out a payday loan or has a high credit-card balance \u2014 to infer the lowest pay a candidate might accept. Companies can also scrape candidates\u2019 public personal social-media pages, she said, to determine if they are more likely to join a union or could become pregnant. 
The data can be used to determine wage increases after an employee is hired, and the practice can veer into discrimination, experts say.<\/p>\n\n\n\n<p>\u201cIf you\u2019re a company who\u2019s messing around with these types of practices on consumers, you\u2019re watching how well they work,\u201d said Lindsay Owens, executive director of Groundwork Collaborative, a progressive think tank. \u201cWorkers are consumers, too. If it works on consumers, it works on workers. It\u2019s the same psychology.\u201d<\/p>\n\n\n\n<p>A first-of-its-kind audit of 500 labor-management artificial-intelligence companies by Veena Dubal, a law professor at University of California, Irvine, and Wilneida Negr\u00f3n, a tech strategist, found that employers in the healthcare, customer service, logistics and retail industries are customers of vendors whose tools are designed to enable this practice. Published by the Washington Center for Equitable Growth, a progressive economic think tank, the August 2025 report identified major U.S. employers as being among these customers, including Intuit <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/intu\" target=\"_blank\" rel=\"noreferrer noopener\">INTU<\/a>, Salesforce <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/crm\" target=\"_blank\" rel=\"noreferrer noopener\">CRM<\/a>, Colgate-Palmolive <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/cl\" target=\"_blank\" rel=\"noreferrer noopener\">CL<\/a>, Amwell <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/amwl\" target=\"_blank\" rel=\"noreferrer noopener\">AMWL<\/a> and Healthcare Services Group <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/hcsg\" target=\"_blank\" rel=\"noreferrer noopener\">HCSG<\/a>.<\/p>\n\n\n\n<p>The report does not claim that all employers using these systems engage in algorithmic wage surveillance. 
Instead, it warns that the growing use of algorithmic tools to analyze workers\u2019 personal data can enable pay practices that prioritize cost-cutting over transparency or fairness.&nbsp;<\/p>\n\n\n\n<p>Colgate-Palmolive\u2019s director of corporate communications, Thomas DiPiazza, said the company \u201cdoes not use algorithmic wage-setting tools to make compensation decisions for our employees or to set new-hire salaries.\u201d<\/p>\n\n\n\n<p>Intuit does \u201cnot engage in such practices,\u201d a spokesperson for that company told MarketWatch.<\/p>\n\n\n\n<p>The other companies named in the report did not respond to MarketWatch\u2019s requests for comment.<\/p>\n\n\n\n<p>Surveillance wages don\u2019t stop at the hiring stage \u2014 they follow workers onto the job, too.<\/p>\n\n\n\n<p>The vendors that provide such services also offer tools that are built to set bonus or incentive compensation, according to the report. These tools track workers\u2019 productivity, customer interactions and real-time behavior \u2014 including, in some cases, audio and video surveillance on the job. 
Nearly 70% of companies with more than 500 employees were already using employee-monitoring systems in 2022, such as software that monitors computer activity, <a href=\"https:\/\/equitablegrowth.org\/research-paper\/estimating-the-prevalence-of-automated-management-and-surveillance-technologies-at-work-and-their-impact-on-workers-well-being\/#footnote-10\" target=\"_blank\" rel=\"noreferrer noopener\">according to<\/a> a survey from the International Data Corporation.<\/p>\n\n\n\n<p>\u201cThe data that they have about you may allow an algorithmic decision system to make assumptions about how much, how big of an incentive, they need to give to a particular worker to generate the behavioral response they seek,\u201d DiSalvo said.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">\u2018Judging our desperation rate\u2019<\/h3>\n\n\n\n<p>One of the clearest examples of surveillance-driven wage setting appears in on-demand healthcare staffing. A <a href=\"https:\/\/rooseveltinstitute.org\/publications\/uber-for-nursing\/\" target=\"_blank\" rel=\"noreferrer noopener\">report<\/a> put together by the Roosevelt Institute, a liberal-leaning think tank, based on interviews with 29 gig nurses, found that the staffing platforms nurses use to sign up for shifts, including CareRev, Clipboard Health, ShiftKey and ShiftMed, routinely use algorithms to set pay for individual shifts.<\/p>\n\n\n\n<p>ShiftKey denied engaging in surveillance wage setting when reached by MarketWatch for comment. \u201cShiftKey unequivocally does not use any data broker services or engage in any surveillance-wage setting,\u201d said Regan Parker, the company\u2019s chief legal and public affairs officer. 
Parker specifically disputed claims from the Roosevelt Institute report suggesting that its platform uses workers\u2019 debt levels to determine pay, stating that ShiftKey does not use credit-card or other debt data to set wages and could not speak to the practices of other platforms.<\/p>\n\n\n\n<p>CareRev, Clipboard Health and ShiftMed did not respond to requests for comment.<\/p>\n\n\n\n<p>Rather than offering a fixed wage, the platforms adjust pay based on what they know about each worker \u2014 including how often a nurse accepts shifts, how quickly they respond to postings and what pay they have accepted in the past, according to the Roosevelt Institute report. Nurses interviewed for the report said this often meant different pay for the same work, even within the same facility.<\/p>\n\n\n\n<p>Critics argue the system rewards workers not for skill or experience, but for what their behavior reveals about their financial vulnerability. Such systems \u201cmay determine pay by what the firm knows about how much a nurse was willing to accept for a previous assignment,\u201d the report\u2019s authors wrote, locking them into lower pay bands over time.<\/p>\n\n\n\n<p>According to Rideshare Drivers United, the union that represents rideshare drivers, algorithmic wages have been shaping pay for that industry\u2019s workers for years. Ben Valdez, a Los Angeles-based rideshare driver, said that after Uber <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/uber\" target=\"_blank\" rel=\"noreferrer noopener\">UBER<\/a> and Lyft <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/lyft\" target=\"_blank\" rel=\"noreferrer noopener\">LYFT<\/a> rolled out new pay algorithms several years ago, his earnings declined \u2014 even as post-pandemic demand rebounded. 
Comparing notes with other drivers, Valdez said he has seen different drivers offered different base fares for the same trip at the same time.<\/p>\n\n\n\n<p>Valdez said drivers are initially shown a take-it-or-leave-it rate, which rises only after enough drivers reject it. How that starting rate is set is opaque. \u201cWhy one driver gets a different, higher base is unknown,\u201d he said.<\/p>\n\n\n\n<p>That uncertainty is by design, according to Zephyr Teachout, a Fordham University law professor. In a 2023 report, Teachout wrote that Uber \u201cuses data-rich driver profiles to match the wage to the individual incentives of the driver and the needs of the platform,\u201d citing prior research by Dubal and reporting from The Markup.<\/p>\n\n\n\n<p>Uber said in an email to MarketWatch that its up-front fares are based on time, distance and demand conditions, and that its algorithms do not use individual driver characteristics or past behavior to determine pay. Rideshare trade association Flex, which responded after MarketWatch reached out to Lyft for comment, said in a statement that data-driven technologies \u201chelp process real-time and historical data to help match workers with a delivery or ride that represents the most efficient use of their time, which, in turn, allows them to spend more time earning.\u201d<\/p>\n\n\n\n<p>Worker advocates remain skeptical. \u201cIt\u2019s judging our desperation rate,\u201d said Nicole Moore, president of Rideshare Drivers United.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Some lawmakers are paying attention<\/h3>\n\n\n\n<p>Critics of surveillance wages argue the practice can lead to discrimination in the workplace by allowing employers to bypass traditional merit-based pay. 
Because these algorithms are designed to find the absolute minimum a person will accept based on their financial history and other factors, they can disproportionately target the most financially vulnerable workers.&nbsp;<\/p>\n\n\n\n<p>This creates a cycle where a person\u2019s past economic distress or personal life choices are used to justify lower pay in the present, often without the employee ever knowing which data points were used against them.<\/p>\n\n\n\n<p>\u201cWe know the concept of the glass ceiling. But at least in that concept, we\u2019ve got some visibility through that glass ceiling. We have a sense of what that world looks like. We can break it if we do the right things and galvanize,\u201d said Joe Hudicka, the author of a book called \u201cThe AI Ecosystems Revolution.\u201d \u201cThis wage-surveillance ceiling \u2014 it\u2019s iron. It\u2019s concrete. It\u2019s something that\u2019s impermeable.\u201d<\/p>\n\n\n\n<p>Legislators have been slower to address surveillance wages than surveillance pricing. New York state recently passed a rule requiring companies to disclose to consumers when their prices are set with algorithms that use their personal data \u2014 but most laws around the country are just looking at prices, not paychecks.<\/p>\n\n\n\n<p>Colorado is trying to go further. A bill introduced in the state House, titled the Prohibit Surveillance Data to Set Prices and Wages Act, would ban companies from using intimate personal data \u2014 such as payday-loan history, location data or Google <a href=\"https:\/\/www.marketwatch.com\/investing\/stock\/goog\" target=\"_blank\" rel=\"noreferrer noopener\">GOOG<\/a> search behavior \u2014 to algorithmically set what someone is paid. The bill carves out performance-based wages, meaning employers could still tie pay to measurable productivity.<\/p>\n\n\n\n<p>Rep. 
Javier Mabrey, a Democrat sponsoring the bill, draws a sharp line between dynamic pricing \u2014 where costs shift based on broad market conditions \u2014 and what he argues these systems actually do. \u201cWhat our bill is about is individualized price setting, which is distinct from dynamic pricing,\u201d he said. \u201cIt requires the company to pull some really personal data related to you, not supply and demand.\u201d<\/p>\n\n\n\n<p>For surveillance pay specifically, the bill would prohibit companies from using workers\u2019 personal data \u2014 without their consent \u2014 to determine what they\u2019re paid. Uber and Lyft have denied using individual driver characteristics to set wages, yet Mabrey said both companies are lobbying against the bill. \u201cWhat is the problem of codifying in law that you\u2019re not allowed to?\u201d he said.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>There is a nice reference to bossware tracking employees in this piece. It&#8217;s interesting that some of the megacorps questioned denied the behavior, but with recent scandals with PayPal and Honey, Microsoft and LinkedIn&#8230; they lie, and morality and ethics of large corporations just aren&#8217;t there. 
It&#8217;s an interesting development, as they&#8217;re looking to limit [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6,7],"tags":[],"class_list":["post-16680","post","type-post","status-publish","format-standard","hentry","category-tech","category-world"],"blocksy_meta":[],"featured_image_src":null,"author_info":{"display_name":"Jason","author_link":"https:\/\/jasonsblog.ddns.net\/index.php\/author\/jturning\/"},"_links":{"self":[{"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/posts\/16680","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/comments?post=16680"}],"version-history":[{"count":1,"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/posts\/16680\/revisions"}],"predecessor-version":[{"id":16681,"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/posts\/16680\/revisions\/16681"}],"wp:attachment":[{"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/media?parent=16680"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/categories?post=16680"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/jasonsblog.ddns.net\/index.php\/wp-json\/wp\/v2\/tags?post=16680"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}