The House of Representatives passed a bipartisan bill in March threatening to ban TikTok unless its parent company ByteDance sells the app. It may take the Senate months to address the legislation, which faces some opposition, so a ban is not imminent. But the relative success of this approach highlights the narrow, problematic pathway for data security reform in the U.S. as we continue to avoid real oversight.
The authors of the House bill focused on national security, emphasizing concerns about the Chinese government’s access to the data of U.S. citizens who use TikTok. Although dialing up national security concerns was an effective tactic to marshal some consensus, that strategy still may not be enough to get the legislation passed, and it won’t address the many security concerns that dog tech companies beyond TikTok.
The U.S. uses a so-called risk-based approach to tech oversight, which addresses problems only after they have already developed. By contrast, Japan and some European countries use a precautionary approach that tries to anticipate risks.
Tech firms have flourished financially under the U.S. approach: Companies can gather, use and monetize data with few constraints. In theory, such freewheeling innovation presented only limited national security risks as long as the U.S. was the dominant global tech player, competing with allies and partners. But China and wildly influential apps such as TikTok expose how that approach created a regulatory vacuum that affects consumer safety. TikTok might be less concerning if our federal lawmakers had already taken comprehensive action to protect consumer data.
Instead of pursuing that broader goal, the House’s recent bill targets ByteDance as a foreign policy threat and offers a framework for designating other “foreign adversary controlled applications.” Perhaps most troubling, the bill extends these considerations to persons deemed “subject to the direction or control of a foreign person or entity” without specifying what form such influence might take.
This would create a highly subjective system that encourages targeting based on national origin and passports, not the type of policy that attracts and retains the best and brightest international talent during an era of extreme competition for technical skills. That is to say nothing of the message it sends to people in the U.S. who have been rocked by increases in anti-Asian discrimination and the Department of Justice’s China Initiative, which ended because of its controversially broad scrutiny of Chinese academics but which some lawmakers have sought to revive.
A common-sense alternative would be to create guardrails that apply to all firms operating in the U.S. That would honor Congress’ mandate to regulate commerce among states, which currently have widely varying laws governing data.
It would also allow the U.S. to better align with its global democratic allies and partners, most of which already have comprehensive data security protections for their citizens. Indeed, though China allows alarmingly broad government access to data, it is ahead of the U.S. on protecting consumer data from corporations. For example, China’s Personal Information Protection Law largely aligns with Europe’s General Data Protection Regulation, with both measures tackling corporate handling of personal information.
Action from the U.S. president and Congress is necessary but not sufficient. A complementary strategy in the private sector would be creating metrics to rate companies based on their data security to help investors consider these issues in stock valuation, much like metrics tracking corporate environmental practices. Another possibility is industry standards for data security, comparable to Energy Star ratings that tell consumers a product is energy efficient.
From both the private and public sectors, there should be more funding for data security education through schools and libraries to teach children and adults how to think more critically about the use of their data. To counter the grip that online life has on many users, we also need greater investments for in-person communities, which can strengthen offline advocacy for consumer interests.
The House bill is not what we need. If the flood of calls from TikTok and its users to congressional offices is any indication, it’s also not what many want. The legislation may, however, show the limit of what the current U.S. regulatory landscape can achieve. In a world increasingly defined by data, we’ll have to expand beyond those limits.