
Module 5 Article Summary: Dynamic Pricing and Bias

Dynamic Pricing and Bias

After reading the article "How Targeted Ads and Dynamic Pricing Can Perpetuate Bias" in the Module 5: Lecture Materials & Resources, write a detailed summary of dynamic pricing and bias.

Submission Instructions:

  • The paper is to be clear and concise; students will lose points for improper grammar, punctuation, and misspelling.
  • The paper is to be 300 words in length, in current APA style, excluding the title, abstract, and references pages.
  • Incorporate a minimum of 2 current (published within the last five years) scholarly journal articles or primary legal sources (statutes, court opinions) within your work.
  • Complete and submit the assignment by 11:59 PM ET on Sunday.
  • Late work policies, expectations regarding proper citations, acceptable means of responding to peer feedback, and other expectations are at the discretion of the instructor.
  • You can expect feedback from the instructor within 48 to 72 hours from the Sunday due date.

——————————————————————————————————————————————

Harvard Business Review | Marketing

How Targeted Ads and Dynamic Pricing Can Perpetuate Bias

by Alex P. Miller and Kartik Hosanagar

November 08, 2019

Summary. In new research, the authors study the use of dynamic pricing and targeted discounts, asking if (and how) biases might arise when the prices consumers pay are decided by an algorithm. Suppose your company wants to use historical data to train an algorithm to identify customers who are most…

In theory, marketing personalization should be a win-win proposition for both companies and customers. By delivering just the right mix of communications, recommendations, and promotions — all tailored to each individual’s particular tastes — marketing technologies can result in uniquely satisfying consumer experiences.

While ham-handed attempts at personalization can give the practice a bad rap, targeting technologies are becoming more sophisticated every day. New advancements in machine learning and big data are making personalization more relevant, less intrusive, and less annoying to consumers. However, along with these developments comes a hidden risk: the ability of automated systems to perpetuate harmful biases.

In new research, we studied the use of dynamic pricing and targeted discounts, asking if (and how) biases might arise when the prices consumers pay are decided by an algorithm. A cautionary tale of this type of personalized marketing practice is that of the Princeton Review. In 2015, it was revealed that the test-prep company was charging customers in different ZIP codes different prices, with discrepancies between some areas reaching hundreds of dollars, despite the fact that all of its tutoring sessions took place via teleconference. In the short term, this type of dynamic pricing may have seemed like an easy win for boosting revenues. But research has consistently shown that consumers view it as inherently unfair, leading to lower trust and reduced repurchase intentions. What’s more, the Princeton Review’s bias had a racial element: a highly publicized follow-up investigation by journalists at ProPublica demonstrated how the company’s system was, on average, systematically charging Asian families higher prices than non-Asian families.


Even the largest of tech companies and algorithmic experts have found it challenging to deliver highly personalized services while avoiding discrimination. Several studies have shown that ads for high-paying job opportunities on platforms such as Facebook and Google are served disproportionately to men. And, just this year, Facebook was sued and found to be in violation of the Fair Housing Act for allowing real estate advertisers to target users by protected classes, including race, gender, age, and more.

What’s going on with personalization algorithms and why are they so difficult to wrangle? In today’s environment — with marketing automation software and automatic retargeting, A/B testing platforms that dynamically optimize user experiences over time, and ad platforms that automatically select audience segments — more important business decisions are being made automatically without human oversight. And while the data that marketers use to segment their customers are not inherently demographic, these variables are often correlated with social characteristics.

To understand how this works, suppose your company wants to use historical data to train an algorithm to identify customers who are most receptive to price discounts. If the customer profiles you feed into the algorithm contain attributes that correlate with demographic characteristics, the algorithm is highly likely to end up making different recommendations for different groups. Consider, for example, how often cities and neighborhoods are divided by ethnic and social classes and how often a user’s browsing data may be correlated with their geographic location (e.g., through their IP address or search history). What if users in white neighborhoods responded most strongly to your marketing efforts in the last quarter? Or perhaps users in high-income areas were most sensitive to price discounts. (This is known to happen in some circumstances not because high-income customers can’t afford full prices but because they shop more frequently online and know to wait for price drops.) An algorithm trained on such historical data would — even without knowing the race or income of customers — learn to offer more discounts to the white, affluent ones.
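To make the mechanism concrete, here is a minimal Python sketch of the scenario described above. Everything in it — the ZIP-code buckets, the response rates, the break-even threshold — is simulated purely for illustration; it is not the authors' model. The point is that the algorithm never sees income, only a location signal correlated with it, yet its learned discount policy still splits along income lines.

```python
import random

random.seed(0)

# Simulated historical campaign data: (zip_bucket, responded_to_discount).
# The algorithm only ever sees the ZIP bucket. The hidden correlation:
# buckets 8-9 are high-income areas, where shoppers responded to
# discounts 60% of the time; buckets 0-7 responded only 20% of the time.
history = []
for _ in range(10_000):
    zip_bucket = random.randint(0, 9)
    high_income_area = zip_bucket >= 8          # never shown to the model
    p_respond = 0.6 if high_income_area else 0.2
    history.append((zip_bucket, random.random() < p_respond))

# "Train": estimate the response rate per ZIP bucket from the history.
counts = {b: [0, 0] for b in range(10)}        # bucket -> [responses, total]
for bucket, responded in history:
    counts[bucket][0] += responded
    counts[bucket][1] += 1
rate = {b: r / n for b, (r, n) in counts.items()}

# Policy: offer a discount wherever the estimated response rate beats
# the campaign's break-even threshold.
policy = {b: rate[b] > 0.4 for b in range(10)}

print(policy)  # discounts flow only to buckets 8-9, the high-income areas
```

Even though income was never a feature, the learned policy reproduces the income divide — the essence of proxy discrimination.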

To investigate this phenomenon, we looked at dozens of large-scale e-commerce pricing experiments to analyze how people around the United States responded to different price promotions. By using a customer’s IP address as an approximation of their location, we were able to match each user to a US Census tract and use public data to get an idea of the average income in their area. Analyzing the results of millions of website visits, we confirmed that, as in the hypothetical example above, people in wealthy areas responded more strongly to e-commerce discounts than those in poorer ones. And since dynamic pricing algorithms are designed to offer deals to the users most likely to respond to them, marketing campaigns would probably systematically offer lower prices to higher-income individuals going forward.
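The measurement idea described above — join each visit to an area-income estimate, then compare how strongly each income group responds to discounts — can be sketched in a few lines. The visit records and the $60,000 cutoff below are invented for illustration; the actual analysis matched millions of visits to US Census tract data.

```python
# Each record is a website visit: (tract_median_income, saw_discount, purchased).
# These eight toy records stand in for millions of real experiment logs.
visits = [
    (95_000, True, True),  (95_000, True, True),
    (95_000, False, False), (95_000, False, True),
    (40_000, True, False),  (40_000, True, False),
    (40_000, False, False), (40_000, False, False),
]

def lift(records):
    """Purchase-rate difference: discount group minus no-discount group."""
    treated = [p for (_, d, p) in records if d]
    control = [p for (_, d, p) in records if not d]
    return sum(treated) / len(treated) - sum(control) / len(control)

# Split visits by area income (illustrative $60k cutoff) and compare lift.
rich = [v for v in visits if v[0] >= 60_000]
poor = [v for v in visits if v[0] < 60_000]
print(lift(rich), lift(poor))  # prints: 0.5 0.0
```

A higher lift in wealthy areas is exactly the signal a dynamic pricing algorithm would exploit, steering future discounts toward those areas.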

What can your company do to minimize these socially undesirable outcomes? One possibility for algorithmic risk mitigation is formal oversight of your company’s internal systems. Such “AI audits” are likely to be complicated processes, involving assessments of the accuracy, fairness, interpretability, and robustness of all consequential algorithmic decisions at your organization.

While this sounds costly in the short term, it may turn out to be beneficial for many companies in the long term. Because “fairness” and “bias” are difficult to universally define, getting into the habit of having more than one set of eyes looking for algorithmic inequities in your systems increases the chances you catch rogue code before it ships. Given the social, technical, and legal complexities associated with algorithmic fairness, it will likely become routine to have a team of trained internal or outside experts try to find blind spots and vulnerabilities in any business processes that rely on automated decision making.
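As one concrete illustration of what such an audit might check first, here is a minimal Python sketch of a demographic-parity test on a pricing system's decision log: how often does each group receive a discount, and how large is the gap? The group labels and the 0.1 tolerance are hypothetical choices made for this sketch, not a standard.

```python
def discount_rate_by_group(offers):
    """offers: list of (group_label, got_discount) pairs from the decision log."""
    totals = {}                                  # group -> (discounts, total)
    for group, got in offers:
        hits, n = totals.get(group, (0, 0))
        totals[group] = (hits + got, n + 1)
    return {g: hits / n for g, (hits, n) in totals.items()}

def parity_gap(offers):
    """Largest difference in discount rates between any two groups."""
    rates = discount_rate_by_group(offers).values()
    return max(rates) - min(rates)

# Audit a (toy) log of the pricing algorithm's decisions by area.
log = [("tract_A", True), ("tract_A", True), ("tract_A", False),
       ("tract_B", False), ("tract_B", False), ("tract_B", True)]
gap = parity_gap(log)
if gap > 0.1:  # tolerance chosen by the audit team
    print(f"flag for review: discount-rate gap of {gap:.2f} across areas")
```

A real audit would go further — testing other fairness definitions, interpretability, and robustness — but even a check this simple can surface the kind of skew the Princeton Review case exhibited.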

As advancements in machine learning continue to shape our economy and concerns about wealth inequality and social justice increase, corporate leaders must be aware of the ways in which automated decisions can cause harm to both their customers and their organizations. It is more important than ever to consider how your automated marketing campaigns might discriminate against social and ethnic groups. Managers who anticipate these risks and act accordingly will be those who set their companies up for long-term success.


  • Alex P. Miller is a doctoral candidate in Information Systems & Technology at the University of Pennsylvania’s Wharton School.
  • Kartik Hosanagar is a Professor of Technology and Digital Business at The Wharton School of the University of Pennsylvania. He was previously a cofounder of Yodle Inc. Follow him on Twitter @khosanagar.

Copyright © 2020 Harvard Business School Publishing. All rights reserved. Harvard Business Publishing is an affiliate of Harvard Business School.


"Place your order now for a similar assignment and have exceptional work written by our team of experts, guaranteeing you A results."

Order Solution Now

Our Service Charter


1. Professional & Expert Writers: Ace Tutors only hires the best. Our writers are specially selected and recruited, after which they undergo further training to perfect their skills for specialization purposes. Moreover, our writers are holders of masters and Ph.D. degrees. They have impressive academic records, besides being native English speakers.

2. Top Quality Papers: Our customers are always guaranteed of papers that exceed their expectations. All our writers have +5 years of experience. This implies that all papers are written by individuals who are experts in their fields. In addition, the quality team reviews all the papers before sending them to the customers.

3. Plagiarism-Free Papers: All papers provided by Ace Tutors are written from scratch. Appropriate referencing and citation of key information are followed. Plagiarism checkers are used by the Quality assurance team and our editors just to double-check that there are no instances of plagiarism.

4. Timely Delivery: Time wasted is equivalent to a failed dedication and commitment. Ace Tutors is known for the timely delivery of any pending customer orders. Customers are well informed of the progress of their papers to ensure they keep track of what the writer is providing before the final draft is sent for grading.

5. Affordable Prices: Our prices are fairly structured to fit in all groups. Any customer willing to place their assignments with us can do so at very affordable prices. In addition, our customers enjoy regular discounts and bonuses.

6. 24/7 Customer Support: At Ace Tutors, we have put in place a team of experts who answer all customer inquiries promptly. The best part is the ever-availability of the team. Customers can make inquiries anytime.