TikTok’s executives and employees were well aware that its features foster compulsive use of the app, as well as its corresponding negative mental health effects, according to NPR. The broadcasting organization reviewed the unredacted documents from the lawsuit filed by the Kentucky Attorney General’s Office, as published by Kentucky Public Radio. More than a dozen states sued TikTok a few days ago, accusing it of “falsely claiming (that it’s) safe for young people.” Kentucky Attorney General Russell Coleman said the app was “specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control.”
Most of the documents submitted for the lawsuits had redacted information, but Kentucky’s had faulty redactions. Apparently, TikTok’s own research found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” TikTok’s executives also knew that compulsive use can interfere with sleep, work and school responsibilities, and even “connecting with loved ones.”
They reportedly knew, as well, that the app’s time-management tool barely helps keep young users away from the app. While the tool sets the default limit for app use to 60 minutes a day, teens were still spending 107 minutes on the app even when it was switched on. That’s only 1.5 minutes shorter than the average use of 108.5 minutes a day before the tool was launched. Based on the internal documents, TikTok measured the tool’s success by how it “improv(ed) public trust in the TikTok platform via media coverage.” The company knew the tool wasn’t going to be effective, with one document saying that “(m)inors do not have executive function to control their screen time, while young adults do.” Another document reportedly said that “across most engagement metrics, the younger the user, the better the performance.”
In addition, TikTok reportedly knows that “filter bubbles” exist and understands how they could potentially be dangerous. Employees conducted internal studies, according to the documents, wherein they found themselves sucked into negative filter bubbles shortly after following certain accounts, such as those focusing on painful (“painhub”) and sad (“sadnotes”) content. They’re also aware of content and accounts promoting “thinspiration,” which is associated with disordered eating. Due to the way TikTok’s algorithm works, its researchers found that users are placed into filter bubbles after 30 minutes of use in one sitting.
TikTok is struggling with moderation, as well, according to the documents. An internal investigation found that underage girls on the app were getting “gifts” and “coins” in exchange for live stripping. And higher-ups in the company reportedly instructed their moderators not to remove users reported to be under 13 years old unless their accounts state that they indeed are under 13. NPR says TikTok also acknowledged that a substantial amount of content violating its rules gets through its moderation techniques, including videos that normalize pedophilia and glorify minor sexual assault and physical abuse.
TikTok spokesman Alex Haurek defended the company and told the organization that the Kentucky AG’s complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He also said that TikTok has “robust safeguards, which include proactively removing suspected underage users” and that it has “voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”