• November 30, 2022


Organizations That Maintain Trust In 2023 Will Thrive



It was the “Sephora shot” heard around the world — to the tune of $1.2 million.

When California fined the beauty brand this summer, it was the first enforcement action under the California Consumer Privacy Act.

The state said Sephora illegally sold consumers’ personal data by allowing third-party trackers on its website in exchange for targeted advertising and discounted analytics services. On top of that, California said Sephora failed to honor opt-out requests that its customers made via browser privacy controls.
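For context, the browser privacy controls at issue here are opt-out signals like the Global Privacy Control, which participating browsers send as a `Sec-GPC: 1` request header. A minimal sketch of how a site might detect the signal (the function name and the sample header dictionary are illustrative, not taken from any Sephora system):

```python
def honors_gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Under the GPC proposal, browsers with the control enabled send the
    request header `Sec-GPC: 1`. Real HTTP header lookups should be
    case-insensitive; a plain dict is used here for brevity.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# A site honoring the signal would treat it as an opt-out of data sales
# and suppress its third-party trackers for that visitor:
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}
if honors_gpc_opt_out(request_headers):
    print("opt-out detected: suppress third-party trackers")
```

California’s position in the Sephora case was that ignoring such signals, while continuing to share data with tracking vendors, counts as a failure to process a consumer’s opt-out request.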

Retailers have known about the law since it passed in 2018 and went into effect two years later. But many brushed it off, believing it wouldn’t apply to them, wouldn’t hurt their bottom lines, or wouldn’t change how consumers felt about their brands. Many instead focused on the European Union’s General Data Protection Regulation, fearful of record-setting fines like the hundreds of millions of dollars levied against Meta and Amazon.

That’s where California Attorney General Rob Bonta stepped in.

“Today’s settlement with Sephora makes clear we will not hesitate to enforce the law,” Bonta said in August in widely distributed remarks. “It’s time for companies to get the memo: Protect consumer data. Honor their privacy rights. The kid gloves are coming off. My office will not hesitate to protect consumers.”

For a retailer owned by the global French behemoth LVMH — the folks behind Louis Vuitton and much more — the price may have seemed like a tiny slap on the wrist.

But it was much more for a retailer suffering from shrinking margins and the Amazonification of everything. And it was just the first of many moves in a mandatory compliance program under California’s top law enforcement officer.

The California Privacy Rights Act, which expands on the CCPA, becomes effective on Jan. 1, 2023. With it, the 30-day warning period that gave companies time to fix their privacy violations (a warning California said Sephora received and ignored) goes away.

It’s not just California. Virginia, Colorado, Connecticut, and Utah will have laws similar to the CCPA next year. Congress is also debating federal online privacy legislation.

One message is clear: The sticks are small. But they’re getting bigger and broader.

And for those like Sephora, the adage of “all press is good press” is, for once, not true.

Regulators are awakening the consumer. And consumers are awakening regulators.

It’s become an innovation imperative. Companies must listen to survive.

Technology Has a Trust Problem

Responsible And Ethical Technology Strategy, a Forrester analyst report published in September 2022, put it bluntly: “Technology has a trust problem.”


Surveying online American adults this year, the firm found that just 59% of them “agree companies they purchase from make every effort to do the right thing morally.”

Consumers don’t just make decisions based on the product or the price. They also decide based on trust in the brand and in the technology that brings it their way.

Forrester found that nearly one in five consumers could be identified as “values-motivated.” Those buyers, it said, spent 28% more money online each month than the average American online adult. In another Forrester survey, from 2021, about one-fifth of data and analytics managers at companies using or looking to adopt AI, including facial recognition, said their employees’ lack of trust in the technology was the biggest hurdle to adoption.

Privacy is a big part of those values. A majority of U.S. customers — 62% — surveyed by Forrester said that the way a company uses their information was “important in their purchase decisions.” Likewise, 39% of Americans surveyed said bad press would make them stop doing business with a company altogether.

The importance of privacy goes beyond the consumer. Employees also seek trust, and firms must compete for talent by demonstrating better values. In just one high-profile example, Meta’s job-offer acceptance rates dropped dramatically after the Cambridge Analytica scandal.

Trust in tech is everyone’s problem, not just that of the most prominent high-tech creators or siloed company lawyers. Yet ethics are missing from much of the C-suite conversation. For example, just 17% of CIOs and 25% of CTOs told Deloitte they strongly cared about ethical technology use.

So what’s to be done?

The problems can seem vast, expensive, and complicated to players in marketing, legal, and tech. But in reality, the ideas are simple: put trust and power in consumers, give them the control they need over their identities, and let them make the choices that are best for them. Companies must also install their own privacy guardrails, not just follow the bare minimum of the law under the threat of government fines and regulation. Once again, this isn’t about a regulatory burden but an innovation imperative.

This change of mindset to data stewardship on behalf of the consumer will reap dividends and push technological advances further. And, yes, it’s just plainly the right thing to do.
