Facebook and Twitter are suspending cooperation with data requests after Hong Kong introduced a new “security” law. CC-licensed photo by Jonathan van Smit on Flickr.
You can sign up to receive each day’s Start Up post by email. You’ll need to click a confirmation link, so no spam.
A selection of 10 links for you. Again. I’m @charlesarthur on Twitter. Observations and links welcome.
Facebook and its WhatsApp messaging service, along with Twitter, have suspended processing requests for user data from Hong Kong law-enforcement agencies following China’s imposition of a national-security law on the city.
WhatsApp is “pausing” such reviews “pending further assessment of the impact of the National Security Law, including formal human-rights due diligence and consultations with human-rights experts,” a WhatsApp spokeswoman said in response to a Wall Street Journal query Monday.
A spokeswoman for parent company Facebook said in a later statement that it was doing the same. “We believe freedom of expression is a fundamental human right and support the right of people to express themselves without fear for their safety or other repercussions,” the Facebook statement said.
Twitter said in a later statement that it “paused” all data and information requests from Hong Kong authorities immediately when the law went into effect last week.
The moves have put U.S. technology titans on a potential collision course with Beijing, after China fast-tracked the national-security legislation that includes a provision mandating local authorities to take measures to supervise and regulate the city’s previously unfettered internet.
• Twitter doesn’t seem ever to have handed over any data about Hong Kong;
• Facebook had 241 requests in the second half of 2019, of which it granted 46%. (That includes Instagram. WhatsApp doesn’t seem to have a separate transparency report, and I can’t figure out if Facebook includes it.)
Of course the small numbers don’t tell us much. Things are going to change dramatically under the new “security law”, which is basically a way to enforce Chinese law in Hong Kong – destroying the “one country, two systems” principle.
unique link to this extract
Jonathan Zittrain with a neat idea:
while an independent oversight board might help with the interpretation of content policies, the job of fact-checking questionable ads is, naturally, fact-specific. The 2020 campaign could see the placement of hundreds of thousands of distinct ad campaigns—far more than Facebook’s oversight board could handle either directly or on some kind of appeal. And there won’t be easy consensus—outside of those obviously deceptive vote-next-Wednesday messages—around what’s “demonstrably false.” That’s not a reason not to vet the ads, especially when the ability to adapt and target them in so many configurations makes it difficult for an opposing candidate or fact-checking third party to catch up to them and rebut them. Instead, we should be thinking as boldly as we can about process.
That brings us back to juries. For all that people might disparage them, and try to avoid serving on them, that small group of citizens has been designed to play a vital role in the high-stakes administration of justice, not as much because 12 randos have special expertise, but because they stand in for the rest of us: I might not agree with what they did, but I wasn’t there, and they heard the evidence, and next time it could be me asked to play their role.
In that spirit, why shouldn’t public librarians be asked in small panels, real or virtually convened, to evaluate ads? Today only 33% of Americans have trust in the news media, but 78% trust libraries to help them find information that is “trustworthy and reliable.”
There would be tons and tons and tons of ads, so I guess people would have to be instantly co-opted into juries to make decisions. Maybe that would be the CAPTCHA on logging into Facebook – “is this ad valid?” Though I’m not sure how much you could trust the decisions. They could just be random, get-rid-of-it responses.
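The jury idea above reduces to two mechanics: draft a small random panel from a pool, and only accept a verdict when it clears a supermajority. Here's a toy sketch of that – the panel size, the two-thirds threshold and all names are my own assumptions for illustration, not anything Facebook or Zittrain has specified:

```python
import random
from collections import Counter

def panel_verdict(votes, threshold=2/3):
    """Return the plurality label if it clears the threshold, else 'no consensus'."""
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= threshold else "no consensus"

def convene_panel(pool, size=12, seed=None):
    """Randomly draft `size` members from the reviewer pool, jury-style."""
    rng = random.Random(seed)
    return rng.sample(pool, size)
```

A split panel returning "no consensus" rather than a coin-flip verdict is one way to guard against the random, get-rid-of-it responses worried about above.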
Anna Wiener on the American legislative debate around Section 230, which exempts online platforms from liability for content their users post:
Ultimately, the problems that need solving may not be ones of content moderation. In the book “Platform Capitalism,” published in 2017, the economist Nick Srnicek explores the reliance of digital platforms on “network effects,” in which value increases for both users and advertisers as a service expands its pool of participants and suite of offerings.
Network effects, Srnicek writes, orient platforms toward monopolization; monopolization, in turn, makes it easier for a single tweet to be an extension of state power, or for a single thirty-six-year-old entrepreneur, such as Zuckerberg, to influence the speech norms of the global digital commons. Both outcomes might be less likely if there were other places to go. The business model common to many social-media platforms, meanwhile, is itself an influence over online speech. Advertisers are attracted by data about users; that data is created through the constant production and circulation of user-generated content; and so controversial content, which keeps users posting and sharing, is valuable. From this perspective, Donald Trump is an ideal user of Twitter. A different kind of business might encourage a different kind of user.
Nothing about the current arrangement should be treated as inevitable; the commercial Internet is relatively young.
Underpinning all these changes is the extra burden placed on businesses, which have to spend time and money putting these systems in place. But there’s one government request to pubs that carries risks to customers’ privacy and poses a data-heavy bureaucratic problem for landlords. They’ve been asked to record customer details for the NHS Test and Trace scheme, which can be used to identify people in the event of an outbreak.
The idea is simple but the execution is tough. The government says pubs, restaurants and cafes – as well as hotels, museums, cinemas, zoos and hairdressers when they reopen – should collect information about every single person who has visited.
This includes names, contact numbers, the date of the visit and arrival and departure times and, in the case of businesses where customers only interact with one staff member, the name of that staff member. A group can provide just one phone number instead of contact details for everyone. However, when combined, this data can give a sense of who individuals interact with, where and for how long. The data should be stored for 21 days and provided to the NHS if it is required.
The scheme is voluntary: neither businesses nor customers are required to collect or provide this information by law. But there’s plenty of potential for things to go wrong.
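To see how little is actually being asked for – and where the retention window bites – here is a minimal sketch of the record a venue might keep. The field names are my own; the guidance only lists the categories of data and the 21-day retention period:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record for a venue's Test and Trace log.
# Field names are assumptions; the guidance lists only categories of data.
@dataclass
class CheckIn:
    name: str
    phone: str        # one number can stand in for a whole group
    visit_date: date
    arrival: str      # e.g. "18:30"
    departure: str    # e.g. "20:15"

def purge_expired(log, today):
    """Drop entries older than the 21-day retention window."""
    cutoff = today - timedelta(days=21)
    return [c for c in log if c.visit_date >= cutoff]
```

The privacy risk the article worries about is exactly what this sketch omits: nothing stops a landlord from keeping the list longer than 21 days, or using it for marketing.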
The cartoonist Matt, in the Daily Telegraph, depicted how this would go: the landlord saying “It’s going to be fun contacting Donald Duck, Mickey Mouse, Spartacus and Joris Bohnson if there’s a problem later.” (Thanks G for the link.)
unique link to this extract
Nokia selected the type of chip it thought would work best before an important technical debate had been settled, Mr. Uitto said. A telecom-industry consortium that included Nokia hadn’t finalized the standards for how cellular antennas should communicate with phones and other devices.
Nokia had two options. One is called a “system-on-chip,” or SoC. Advantage: It is power efficient and cheap to make. Disadvantage: Once the chip is made, it is difficult to reprogram. If Nokia ordered a supply of SoC chips and then 5G standards didn’t support them, the company would have a bunch of useless chips.
The other option was the so-called field-programmable gate array, or FPGA, chip. Its advantage was flexibility. An FPGA can be reprogrammed after it goes into an antenna. Nokia could start making antennas with the chips, and wireless carriers could reprogram them to suit whatever 5G standards would be adopted later.
Nokia focused on the more expensive FPGA. When the development of 5G accelerated, and standards crystallized sooner than expected, around 2018, Nokia realized it had too many FPGA chips and not enough of the cheaper ones [SoCs] that Huawei and Ericsson had bet on.
The FPGA was like “buying a car with a lot of features that you don’t use,” said Sandro Tavares, Nokia’s head of mobile marketing. The SoC, meanwhile, “has exactly what you need, so you’re not spending that much money there.”
One European telecom executive said the price tag for certain Nokia equipment was double that of products by Huawei and Ericsson using the SoC chips. Nokia executives say the price difference for high-volume products was typically between 5% and 15%. Nokia products using the FPGA chip also used more energy, a downside for wireless carriers trying to cut down power consumption.
Social media has a conflict problem.
Spending even a few minutes on public social media can expose us to dozens of people we know little about, talking about things we know little about. In such a public place, any individual’s reputation, perspectives, and history are difficult to ascertain, and therefore their words must be taken at face value. Coupled with an almost complete lack of standards for participation in the community and a high degree of variance in knowledge among participants, the environment naturally skews toward conflict and tribalism.
One particular effect of this environment is that small misunderstandings, mistakes, or disagreements can unexpectedly explode due to the public nature of discourse and assumptions of bad faith. Meanwhile, very few tools exist to moderate these effects.
This is why it’s my belief that as designed today, social media is out of balance. It is far easier to escalate than it is to de-escalate, and this is a major problem that companies like Twitter and Facebook need to address.
He makes a lot of good points, though it’s hard to see many people climbing down from things – it’s not as if that happens now even given the chance. The option to say “I was wrong” and then turn off replies would be good – but verified users now can choose to turn off replies and effectively insist they’re right.
His mockups are for Twitter. The dynamics on Facebook are rather different, I think.
unique link to this extract
John Sawers is a former head of MI6:
The Trump administration’s motives for trying to destroy Huawei can be debated. But the latest US sanctions, at the end of June and last week, mean that reliable non-Chinese suppliers to Huawei can no longer work with the company. UK intelligence services can therefore no longer provide the needed assurances that Chinese-made equipment is still safe to use in the UK’s telecoms network.
There are now sound technical reasons for the UK to change January’s decision, which would have allowed Huawei up to a 35% share of the UK’s 5G market, and to exclude the company instead. The security assessment is now different because the facts have changed. It helps Boris Johnson that its conclusion points in the same direction as the political pressure from Conservative members of parliament. Reportedly, a fresh decision on Huawei is expected in the next two weeks.
The interesting question is whether Mr Johnson’s decision to exclude Huawei from UK 5G will be justified purely on technical grounds, leaving Huawei itself to decide whether to go ahead with its planned £1bn Cambridgeshire facility, or whether he uses the moment to set out a comprehensive strategy that puts limits on Chinese investment in the UK.
I suspect the UK government — preoccupied by Covid-19, the deep economic recession and the Brexit negotiations — has no bandwidth to come up with a considered, new strategy. But its first response on Hong Kong, especially its open-door offer to almost 3m Hong Kong citizens, suggests there will be a sharper-edged approach.
The shift against China in the past couple of years has been quite something to watch. Question to consider: how would Hillary Clinton have handled what’s going on? I think the US would have ended up taking the same action against ZTE (and stuck to it) and then Huawei – that had been brewing for a long time. Hong Kong though is quite a wild card in all this.
unique link to this extract
Back in early March, we had already spent 2 months covering the COVID-19 outbreak. The team gathered for the morning news meeting, many joining by video conference. “I’m surprised you’re all still in the office. I bet by the end of the week they’ll send everyone home,” Science’s infectious disease reporter declared ominously from the large video screen on the wall.
Science’s offices closed 3 days later. With the pandemic hitting the Washington, D.C., area, our staff began working from home.
It’s been six months since the virus emerged. Over that time, as senior photo editor, I’ve pored over thousands of pictures documenting the effects of this historic crisis.
They’re all remarkable photos in their way. The one I think best captured the surreal nature of the time was the one captioned “A teacher prepares a tablet showing a student’s image for a ‘cybergraduation’ ceremony at a high school in Manila, Philippines, as social distancing continued.” But you might find something else grabs your attention – the gym users inside plastic shrouds, perhaps.
Pavel Durov’s grand cryptocurrency dreams for his Telegram messaging service are ending with an $18.5m civil settlement with the U.S. Securities and Exchange Commission and a pledge to return the more than $1.2bn that investors had put into its TON digital token.
The settlement ends a months-long legal battle between the company and the regulator. In October 2019 the SEC filed a complaint against Telegram alleging the company had raised capital through the sale of 2.9 billion Grams to finance its business. The SEC sought to enjoin Telegram from delivering the Grams it sold, which the regulator alleged were securities. In March, the U.S. District Court for the Southern District of New York agreed with the SEC and issued a preliminary injunction.
In May, Telegram announced that it was shutting down the TON initiative.
…[The SEC said:] “Our emergency action protected retail investors from Telegram’s attempt to flood the markets with securities sold in an unregistered offering without providing full disclosures concerning their project,” said Lara Shalov Mehraban, associate regional director of the New York Regional Office.
There are sure to be plenty more like this, aren’t there?
unique link to this extract
Shawn Bercuson perks up at the mention of John Smith. The chief executive of FinnBin Inc. first spotted the shopper on his site, which sells Scandinavian-inspired boxes for newborns to sleep in, over a year ago.
At first, he thought it was a corporate client—the address was in the heart of Silicon Valley—ready to make a large order, a gift to new parents. But as more orders popped up from John Smith, he wondered whether a competitor was collecting pricing and other information from his site.
“Then it started getting out of hand,” Mr. Bercuson said. “The amount of abandoned carts we got were just insane.” In May, he said, John Smith started and walked away from 73 orders.
Part of the original team that founded Groupon Inc., Mr. Bercuson relies on analytics to make business decisions, from advertising to website design. John Smith fouled that up. When shoppers abandon carts, websites typically send an automated email prodding them to finish the purchase. The dozens of emails to John Smith distort the numbers, as does false shopping traffic.
“I want to know what is working and what’s not,” he said.
He turned to message boards for other online merchants to see if he was John Smith’s only target. He quickly found out he wasn’t, but nobody had answers.
Jeffrey Gornstein thinks of John Smith every day. His site, ComfortHouse.com, which sells home goods such as address plaques and other personalized gifts, has been warned by its email provider about sending emails to nonexistent accounts, due to all the follow-ups sent to John Smith, which bounce back as undeliverable. Every time he gets a readout of recent sales, he scans to see if his foe has visited. He then logs into his email platform to deactivate all the fictitious entries from John Smith.
Turns out that John Smith is a Googlebot checking prices, to make sure the cart prices match the advertised ones. Smart, even if it does screw up sites’ statistics.
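If the culprit is a crawler, merchants can at least keep it out of their abandonment statistics. A minimal sketch, with assumed order fields and a crude user-agent check – Google’s own guidance is to confirm Googlebot with a reverse-DNS lookup of the requesting IP, which this sketch does not do:

```python
# Substrings that suggest a crawler rather than a shopper (illustrative list).
BOT_UA_MARKERS = ("googlebot", "bingbot", "storebot-google")

def looks_like_crawler(order):
    """Crude screen: flag orders whose user agent matches a known bot marker."""
    ua = order.get("user_agent", "").lower()
    return any(marker in ua for marker in BOT_UA_MARKERS)

def abandonment_rate(orders):
    """Share of abandoned carts among human-looking orders only."""
    humans = [o for o in orders if not looks_like_crawler(o)]
    if not humans:
        return 0.0
    return sum(o["abandoned"] for o in humans) / len(humans)
```

A cleverer bot that spoofs a browser user agent would slip straight through, which is why the reverse-DNS check is the sturdier route.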
unique link to this extract
Errata, corrigenda and ai no corrida: none notified