May 19, 2020

Facebook Will Pay $10 Million to Settle 'Sponsored Stories' Lawsuit

After being accused of violating its users’ rights to control how their names, photographs and likenesses are used, Facebook has agreed to pay $10 million to charity.

Facebook was sued by a group of five users who alleged that the site’s “Sponsored Stories” violated California law. If you’re even a somewhat savvy Facebook browser, you’ve probably noticed Sponsored Stories popping up on the side of your news feed. They’re advertisements that appear on the site containing the name and profile picture of a user’s Facebook friend, stating that said friend “likes” the item being advertised.

Not only was the claim often untrue, but Facebook also wasn’t compensating the users featured in the ads for the referral or allowing them to opt out.

Court documents quote Facebook COO Sheryl Sandberg as saying that a friend-endorsed Sponsored Story is worth two to three times as much as a standard Facebook.com ad. Mark Zuckerberg was quoted as calling a trusted referral the “Holy Grail” of advertising.

US District Judge Lucy Koh found that the plaintiffs had plausibly shown that Facebook’s use of people’s names, likenesses and pictures in Sponsored Stories could cause them economic injury.

With its recent IPO drama, the expansion of its Menlo Park headquarters, the purchase of Instagram and swirling rumors of further acquisitions and launches, Facebook surely would have preferred to hold on to the $10 million. But the company should consider itself lucky: earlier court documents state that the proposed class-action lawsuit could have included roughly one in three Americans, leading to a possible payout of billions of dollars.

Although the lawsuit was settled last month, it only became public over the weekend with the release of court documents.

Jun 12, 2021

How changing your company's software code can prevent bias

Removing biased terminology from software can help organisations create a more inclusive culture, argues Lisa Roberts, Senior Director of HR at Deltek

Two-thirds of tech professionals believe organizations aren’t doing enough to address racial inequality. After all, many companies will simply hire a DEI consultant, run a few training sessions and call it a day.

Wanting to take a distinctive yet impactful approach to DEI, Deltek, the leading global provider of software and solutions for project-based businesses, reviewed its software code and removed all exclusionary terminology. By removing terms such as ‘master’ and ‘blacklist’ from its code, Deltek is working to ensure that diversity and inclusion are woven into every aspect of the organization.

Business Chief North America talks to Lisa Roberts, Senior Director of HR and Leader of Diversity & Inclusion at Deltek, to find out more.

Why should businesses today care about removing company bias within their software code?  

We know that words can have a profound impact on people and leave a lasting impression. Many of the words used in technology environments were coined many years ago, and today those words can be harmful to our customers and employees. Businesses should use words that leave a positive impact and help create a more inclusive culture in their organization.

What impact can exclusive terms have on employees? 

Exclusive terms can have a significant impact on employees. It starts with the words we use in our job postings to describe the responsibilities of a position, and of course we also see this in our software code and other areas of the business. Exclusive terminology can be hurtful and even make employees feel unwelcome. That can affect a person’s decision to join the team, stay at a company, or ultimately leave. All of these critical decisions impact the organization’s bottom line.

Please explain how Deltek has removed biased terminology from its software code

Deltek’s engineering team has removed biased terminology from our products, as well as from our documentation. The terms we focused on first were the easiest to identify: blacklist, whitelist, and master/slave relationships in data architecture. We have also made progress in removing gendered language, such as changing ‘he’ and ‘she’ to ‘they’ in some documentation, as well as heteronormative language, which we see most commonly in pick lists that ask you to identify someone as your husband or wife. The work is not done, but we are proud of how far we’ve come with this exercise!
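To make those renames concrete, here is a minimal before/after sketch in Python. The identifiers and domains are invented for illustration; the article does not show Deltek’s actual code.

# Hypothetical illustration of the terminology changes described above.
# Before:
#   blacklist = {"spam.example.com"}
#   whitelist = {"partner.example.com"}
#   def connect(master_host, slave_host): ...

# After: inclusive naming that is also arguably more precise
blocklist = {"spam.example.com"}      # domains to reject
allowlist = {"partner.example.com"}   # domains to accept

def connect(primary_host, replica_host):
    """Connect to the primary database and its read replica."""
    ...

Renames like these are largely mechanical, which is why they make a natural starting point before tackling subtler language in documentation and UI text.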

What steps is Deltek taking to ensure biased terminology doesn’t end up in its code in the future?

What we are doing at Deltek, and what other organizations can do, is put accountability on employees to recognize when this is happening – if you see something, say something! We also listen to our customers and have heard their feedback on this topic. Both of those are reactive, of course, but we are also proactive: we have created guidance that identifies more inclusive alternatives and promotes good practice for communicating in a way that includes and respects others.
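Written guidance like this can also be backed by automation. The Python sketch below scans source files for flagged terms and exits non-zero so a pre-commit hook or CI pipeline can fail the build. It is a minimal sketch of this kind of proactive check, not Deltek’s actual tooling; the term list, suggested replacements and file extension are illustrative assumptions.

#!/usr/bin/env python3
"""Minimal terminology check: scan source files for flagged terms.

A sketch of the kind of proactive check described above, not Deltek's
tooling. The term list, suggestions and file extension are assumptions.
"""
import re
import sys
from pathlib import Path

# Flagged term -> suggested replacement (illustrative)
FLAGGED = {
    "blacklist": "blocklist",
    "whitelist": "allowlist",
    "master": "primary",
    "slave": "replica",
}

PATTERN = re.compile(r"\b(" + "|".join(FLAGGED) + r")\b", re.IGNORECASE)

def main(root: str) -> int:
    problems = 0
    for path in Path(root).rglob("*.py"):  # widen the glob for other languages
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            for match in PATTERN.finditer(line):
                term = match.group(1).lower()
                print(f"{path}:{lineno}: found '{term}', consider '{FLAGGED[term]}'")
                problems += 1
    return 1 if problems else 0  # non-zero exit fails the CI job

if __name__ == "__main__":
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "."))

Wired into a pre-commit hook or CI job, a check like this turns “if you see something, say something” into an automatic gate rather than a memory exercise.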

What advice would you give to other HR leaders who are looking to enhance DEI efforts within company technology? 

My simple advice is to start with what makes sense for your organization and culture. Doing nothing is worse than doing something. And one of the best places to start is by acknowledging that this is not just an HR initiative. Every employee owns the success of D&I efforts, and employees want to help the organization be better. For example, removing biased terminology was an action initiated by our Engineering and Product Strategy teams at Deltek, not HR. You can solicit the voices of employees by asking for feedback in engagement surveys, focus groups and town halls. We hear great recommendations from employees and take those opportunities to improve.
