"Why should I care if China knows how many cat videos I see on TikTok?"

This was a question posed by a listener on NPR just now, read by the NPR interviewer to a couple of “experts” explaining why Tiktok is a danger to U.S. citizens, and you know what? These experts could not answer the very simple question. They spoke of “the aggregate effect” and how China and the U.S. are both responsible, not just China, and yadda yadda yadda, but they couldn’t even try to answer the question as posed. They did ask if we could imagine if Russia had owned Facebook in 2016, and seemed to think that was a satisfying response to a question about China in 2023, so I’ll ask: how is the average U.S. internet user being endangered by Tiktok?

In fairness to the experts, I think the same question could be asked by someone who uses Facebook to look at cat videos. “Why should I care if Mark Zuckerberg knows how many cat videos I see on Facebook?”

But in the case of Facebook, we know what the risk is to the average US internet user. My own parents fell into the conservative misinformation bubble on Facebook without even trying (my dad thankfully got out, but my mom hasn’t yet). Facebook has torn apart families, ended relationships, and contributed to our growing political divide. Every grandparent who got addicted to cat videos in 2012 and now no longer is on speaking terms with their grandchildren has been personally affected by Facebook.

And the terrifying thing is, Facebook did this all with nothing more than algorithms that promoted engagement. They were politically neutral. We have no idea what metrics TikTok has optimized their algorithms for or what further damage they may cause to families or society.

Now… this is the average internet user we’re talking about; it may well be possible to use TikTok strictly for cat videos in small doses. But my kids are on TikTok and I can assure you their content is very political. Plus now they get nonstop ads for sports betting. Thanks, Ohio.

Cat videos on social media are the “Free Candy!” written on the side of the white van. It lures you in.

Neither answer here is much more responsive than the “experts” answer was.

This seems like a very simple case to make, if there were one to be made. I have no idea what it is, but I’d start out with “There may be no danger to someone who uses Tiktok exclusively to watch cat videos, but once you start viewing cat videos, the next insidious step is that Tiktok might then [do awful things to you that I am asking about]…”

I’m quite sure there is an answer to be made. I just don’t understand why people who know the answer refuse to make it, and insist on obfuscating what those dangers are.

I’ve already described the awful things that social media can cause. TikTok isn’t any better than any of them, and it’s slightly worse in that the US has no regulatory control over it. Not that the US is regulating any social media sites all that well.

This is a great indictment of all social media, IMHO.

Here is an article, among many, pointing out the risks of using TikTok, specifically…

One way hackers take advantage of TikTok is by sending users a text message that lets the attackers access those users’ accounts.

Another is leveraging the fact that TikTok uses an insecure HTTP connection to deliver videos instead of the more secure option, HTTPS. This allows cybercriminals to manipulate users’ feeds and plant unsolicited content that could be misleading or disturbing.

So, it seems TikTok can be a Trojan horse that allows hackers to compromise your smartphone. That’s good enuf reason to stay off that app.

So true.

So you’re essentially saying that the terrible danger of Tiktok to U.S. users is that it’s the same as any other social media site, none of which are regulated by the U.S. government in a meaningful way?

Again, it feels to me like I’m putting a soft question (see thread title) on a tee for you to kick a very long distance and you’re just tapping it with your small toe. Show me the harm (or potential harm) being done by Tiktok to a watcher of cat videos.

The thing is, as well as the potential dangers, this kind of decision is at least partly motivated by a trade war and politics. That’s the real answer.

China blocks a lot of Western web sites, particularly social media. I think this is more motivated by “protecting” Chinese citizens from hearing about Tiananmen, or anything else that puts the Communist party in a bad light, but regardless, it also gives Chinese startups a huge protected market.

So…yeah, there are security concerns, but just like with the 5G hoo-ha, it’s also partly tit-for-tat and giving American companies more opportunity (Vine for the win!).

State and non-state actors have used social media to try to influence elections. China is most likely using TikTok to influence the globe in ways we’re barely able to identify. Society should be concerned about this. You have to see the forest. Nobody cares about a tree.

Because that’s not where the risk lies. If it hasn’t changed, the TikTok app requires a lot of permissions on your phone and sends back a lot of data about how you use your phone. It also uses an algorithm to predict what you want to see, and that algorithm can easily be manipulated to push you towards certain beliefs.

Do other social media companies have this? Some of it, at least. But the difference is that TikTok being in China means it might not only be using the data for profits, but as part of an influence operation. Hence it is considered worse.

Do I think it matters? Not that much. But I wouldn’t mind the result of TikTok having to move its Western operations to another location. That’s what would happen if it gets banned.

Here’s a video from Beau of the Fifth Column on why bills banning TikTok are gaining traction:

Again, argument by analogy, by misdirection, by broadening the subject. Again, I ask: what harm is being done, in fact or in potential, to a Tiktok viewer of cat videos?

Just BTW, I’ve never used Tiktok, not even once, and have no interest in it at all. I just heard on the radio what sounded to me like a straightforward question that got (and is getting) a lot of sideways answers.

No, it’s people answering the actual question, which is why TikTok is considered dangerous. It is focusing on one specific situation to the exclusion of all others that is obfuscating.

I’ve asked this question every time someone starts in on the “privacy” bogeyman. The answer always boils down to some version of “targeted ads”, which just makes me shrug my shoulders and go “ok… so?”. It’s not like the alternative is no ads.

First of all, ‘cat videos’ does not describe everything being done on TikTok. But even if it was, they aren’t just recording that you watched a cat video. They are also collecting:

  • where you went all day, in detail
  • everything in your camera roll
  • anything you say, if you accepted TikTok’s defaults, which give them access to your microphone
  • your entire contact list on your phone

It shouldn’t be hard to see the risks in this. First of all, the ‘big data’ collections will allow them to do all kinds of nefarious things at scale. Grabbing the contact list of a politician or their kids could get them personal phone information, etc. A microphone could be turned on whenever location data shows the phone to be in a place where interesting conversations might be held (say, in the home of a business exec or politician). With enough location data, a kid’s phone could be used to track or predict the locations of their parents if their parents are valuable targets.

The list goes on. The data being collected by Tiktok and other apps has a lot of potential for misuse. The difference between Tiktok and the others is that Tiktok is controlled by the Chinese Communist Party, which is not shy about harvesting information from the West to use against it, and which can leverage the data with the power of a state.

Pretty hard.

I went to the grocery store today.
I took pictures of an art exhibition at a museum and some sunset/beach photos
I spoke to a salesclerk and the lady who sold me a museum ticket.
My contact list is mostly friends and family plus a bunch of ladies I used to date and I’m too lazy to delete.

What risks? Sounds to me like some pols are terrified of imposing on their own teenagers the downsides of public life so are trying to terrorize the rest of us into not enjoying cat videos.

Well first of all, not everyone fits the user persona designed around “Most Boring Person Imaginable”. There are plenty of people who work in sensitive areas of government or corporate who probably don’t want everything their camera sees and hears collected by an unknown party for unknown reasons.

People might not want what they do or where they go or who they are with recorded for any number of reasons. Keep in mind that TikTok is owned by a company based in a country that uses social media and other collected data to create a personalized score with real-world consequences.

Right here in the USA you have the billionaire who owns Madison Square Garden using data and facial recognition software to ban people from the venue based on his personal shit list.

Some people might not want the wrong people to know stuff about their medical history, politics, alcohol consumption, or how gay they are or are not, and then use that information to make decisions on hiring, firing, promotions, extending credit, home ownership, etc.

But the biggest risk isn’t China “knowing how many cat videos you watch”. It’s having the ability to decide WHICH cat videos show up in your feed and skewing your perception of the world in a manner that is economically and politically favorable to China (or whoever).

The danger of this data comes when it’s collected en masse and then used to derive patterns of behaviour as well as the risk to individual security.

For example, during the Cold War the Soviets tracked the food preferences of various intel groups in Washington. They also tracked delivery people from Washington restaurants. When they saw an unusual number of pizza deliveries to an office, they knew people there were working overtime. So if the Soviets planned to send soldiers to Syria, and suddenly the Middle East group was working overtime, it would tell them that perhaps they had a leak. Or if they saw pizza orders flooding into a group at the Pentagon, it might confirm their evidence that the U.S. was about to engage in some military action.
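To make the pattern-detection point concrete, here’s a toy sketch (hypothetical numbers and a made-up threshold, not anything any intelligence service or TikTok actually runs) of how trivially a spike in delivery counts can be flagged once you have the data:

```python
# Toy anomaly detector: flag nights where deliveries to an office
# spike well above the historical average (hypothetical data).
from statistics import mean, stdev

def flag_spikes(history, threshold=3.0):
    """Return indices of nights whose delivery count exceeds
    the mean by more than `threshold` standard deviations."""
    mu, sigma = mean(history), stdev(history)
    return [i for i, n in enumerate(history)
            if sigma > 0 and (n - mu) / sigma > threshold]

# 30 ordinary nights, then a sudden burst of late-night orders
deliveries = [2, 3, 2, 4, 3] * 6 + [15]
print(flag_spikes(deliveries))  # → [30], the unusual final night
```

The point isn’t the statistics; it’s that a few lines of code turn a boring data stream into an “something is happening at this office tonight” alert, and location or microphone data scales the same way.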

Spycraft is all about information. Cell phones are a treasure trove of information. What information might the Chinese get if location data from the kids of Washington politicians shows a large number of them were picked up from school early? How about a child of a politician whose microphone is turned on by the CCP every time he is in the same room as Mom, who sits on the intelligence committee?

Then there are the commercial uses. China might have known about layoffs at Google before the press did, because an AI might have sniffed out a strange pattern on the phones of all the employees who have TikTok installed. They might be able to determine what Google is working on by the pattern of movements and growth in numbers of employees in specific departments.

This isn’t at all about, “I don’t break the law, so I have nothing to worry about.” It’s about the dangers of mass data collection inside a country by another country hostile to it, and with a demonstrated willingness to use information for nefarious purposes.

The rise of AIs that can process humongous amounts of data and find correlations and causes makes it infinitely more dangerous. We may not even know right now what the Chinese will discover once they start trawling the data with AI. But a query such as, “Find every politician who is likely to be having an affair” is very possible, and very useful for blackmail and control.

Why should I care if someone is planting things in my apartment that set off a Geiger counter? I don’t even own a Geiger counter.

But maybe, “sets off a Geiger counter” isn’t the feature of those things that I should be worried about.

This is the old tired “I have nothing to hide, so what’s the big deal with surveillance?” argument, which has been refuted many times before.

Stop thinking about it in terms of “What’s so bad about it them knowing this?” and reframe as “Why do they need to know this?” because they don’t. There’s no reason why they need to know all these details about where we go, who we’re with, and who we know. Asking for it isn’t a good enough reason, even as the price of endless cat videos. I don’t care if it’s the Chinese government or an American company, nobody needs to know these details about me and they sure as hell don’t deserve to make money off that information.

Anyway, as folks have already said, watching cat videos (and knowing that you watched cat videos) is not the real danger of social media in general and TikTok in particular. It’s the fact that an unknown algorithm is feeding you cat videos, along with other videos it thinks you’ll like, along with other videos it wants to make you like. Meanwhile, videos from queer people, or Black women, or other marginalized groups get limited and silenced because TikTok and their algorithm have decided nobody should like those videos. That’s the real danger; the cat videos just get you in the door, and the data collection is the price of admission.

I was listening to the ‘All-In’ podcast (a bunch of venture capitalists playing poker and talking about various things), and the subject of TikTok came up. One of them made the claim that TikTok feeds Western children mindless diversion (“cat videos”, etc.) with a very addictive algorithm, while Chinese children get fed educational material and other ‘enriching’ stuff. I don’t know if it’s true, but it’s another way TikTok could be hurting Western kids.