Facebook's 'Time Management' tool shows it hasn't stopped treating users like psychological guinea pigs
Facebook is evolving to tackle problems the company itself unwittingly enabled, like election meddling and screen addiction. But the fast-paced nature of Silicon Valley is ultimately holding it back from true accountability.
On Wednesday, Facebook launched "Time Management," a set of tools that lets users see how much time they spend in the app, mute notifications, and set a reminder to limit their daily usage.
Time Management is part of Facebook's overarching goal to improve "wellbeing" on Facebook. In other words, it wants to make people enjoy the time they spend on Facebook again, and to reverse users' perception of it as an addictive, time-sucking venue for spam and endless flame wars.
High-caliber, academic research informs the initiative. Through a team of experts, and both internal and external studies, Facebook established that spending time "intentionally" rather than passively on Facebook — interacting directly with friends and family — makes people feel more supported, confident, and just happier.
Here's where things get a little trickier. From here, Facebook thinks: Great, let's get more of that intentional engagement, please! Unfortunately, the answer Facebook has provided for how to increase that "intentionality" seems less scientifically rigorous.
According to briefings with Facebook representatives, the company did not rely on behavior change research or experts to determine, before shipping the product, that "visibility" into time spent and the three tools would be an effective way to increase intentionality.
"That's not what the research was looking at," Facebook's director of research David Ginsberg told Mashable. "The research was looking at whether this way of presenting the information is useful to them, whether they want this information, and is this helpful."
To see if "Time Management" has any impact on intentionality, Facebook is relying on the standby method of the tech world: ship a product, see how users like it and whether it's useful, and refine from there.
"This is a beginning not an end," Ginsberg said. "This was a starting point, and we'll listen to the feedback from the community as they use this."
That approach might be fine for products like reactions buttons, or a new camera product. But for a product meant as part of a solution to a psychological problem, not seeing the research through from problem to solution isn't good enough. It requires users to be the beta testers for their own happiness. And on that subject, real-time testing doesn't cut it.
In December 2017, Mark Zuckerberg announced that Facebook would be refocusing to ensure high quality, meaningful engagement between family and friends, instead of focusing on news. That was based on research, helmed by Ginsberg, that showed that meaningful time made users happier.
"We have a team of experts who focus on the question about, and the issues around, wellbeing and social media and that relationship," Ginsberg said. "When you're using Facebook in an intentional, and in an active, and in an engaging way, it tends to be associated with increases in wellbeing over time."
Time Management is the next iteration of that process. It is one facet of Facebook's answer to the question of, well, if intentionality is good, how do we get people to be more intentional about the time they spend on Facebook? Our research shows X, therefore, we will do Y.
But there's a step missing from that equation. Given Facebook's professed devotion to the wellbeing of its users, it should have conducted research showing that these tools actually help before launching them.
Facebook said it incorporated "guidance from external experts and top scholars" into the products. But according to our recent conversation with Ginsberg, Facebook did not consult behavioral research in the creation of these products. Instead, it designed them in response to requests from users.
"One of the things we've heard from our community is that they want better tools to understand their use on Facebook, and to be able to have reminders that they're not slipping into that passive, unintentional use, that can happen," Ginsberg said. "I wouldn't say that [Time Management] is the first, or the best, I would say that this is one way of many that we are going down the road on."
Here's what Facebook did do: In small groups, it tested to make sure the Time Management tools were easy to use and understand. And it got internal feedback from employees who had early versions of the feature.
"People are excited to have these tools," Ginsberg said. "Obviously those are small focus group kind of tests, and we'll have to see how the community reacts when it's rolled out at large."
But as my colleague Karissa Bell pointed out, Facebook has yet to point to any research that shows that these tools will be effective. Instead, Facebook's overarching approach, typical of processes in the tech world, is to see if and how Time Management helps people, and refine the product from there.
"This is a pretty common approach in tech – roll a product/feature out, see how people use it, and then improve it over time," Facebook representative Gretchen Sloan told Mashable over email. "It helps us (and other companies in our space) quickly design and ship new features that people want."
This may have worked in the past, but for issues of "wellbeing" and mental health, the same standard shouldn't apply. Just imagine launching a treatment for any other psychological problem without first ensuring that the remedy would be effective.
Facebook has received criticism for conducting covert psychological and sociological tests on its users. It manipulated the News Feed to see how happy vs. sad posts would affect people's moods, and it tested how adding an "I Voted" button affected voter turnout. People did not appreciate that Facebook was using its billion-strong user base to study behavior without their knowledge or consent.
Time Management doesn't unwittingly manipulate people's behavior. And, aside from the fact that it's a pretty feel-good product meant to show that Facebook cares, it seems to be an earnest attempt to help make the Facebook experience better.
But Time Management is meant to be one solution to a problem with serious psychological consequences, including depression and stunted development in teens. And Facebook did not apply the same rigor to the execution as it did to identifying the problem and a potential solution. It's taking a throw-everything-at-the-wall-and-see-what-sticks approach, which gives us less faith in this as a solution. It also relies on users to help solve a problem Facebook created, without doing the due diligence to show that the fix actually works.
Look, it's clear that Facebook is earnestly trying to help its users enjoy Facebook more. But with everything Facebook has put us through this year, we are not going to applaud a product rolled out without scientific evidence that it can, will, and does work. Facebook still doesn't get that it can't treat psychology the same way it treats everything else. Fuck "fuck it, ship it." When it comes to user psychology, user product testing and incremental refinement should come later, not first.
August 1, 2018 at 11:21AM