The 2024 U.S. presidential election is entering its final stretch, which means state-backed hackers are coming out of the shadows to meddle in their own ways. Among them is Iran’s APT42, a hacking group affiliated with Iran’s Islamic Revolutionary Guard Corps, which Google’s Threat Analysis Group said targeted nearly a dozen people associated with the campaigns of Donald Trump and Joe Biden (now Kamala Harris).
The disaster that is the data breach at data brokerage and background check firm National Public Data is just getting started. While the breach occurred months ago, the company only publicly acknowledged it on Monday, after someone published what they claimed were “2.9 billion records” of people in the United States, the United Kingdom and Canada, including names, physical addresses and Social Security numbers. But ongoing analysis of the data shows the story is far more complicated — and so are the risks.
Now, bike shifters and gym lockers are being added to the list of things that can be hacked. Security researchers revealed this week that Shimano’s Di2 wireless shifters may be vulnerable to several radio-based attacks, which could allow someone to remotely change a rider’s gears or prevent them from changing gears at a crucial moment in a race. Meanwhile, other researchers have discovered that it’s possible to extract the admin keys from electronic lockers used in gyms and offices around the world, potentially giving a criminal access to all the lockers in a single location.
If you use a Google Pixel phone, keep an eye out: An unpatched vulnerability in a hidden Android app called Showcase.apk could give an attacker the ability to gain deep access to your device. Exploiting the vulnerability may require physical access to a targeted device, but the iVerify researchers who discovered the flaw say it may be possible through other vulnerabilities, too. Google says it plans to release a fix “in the coming weeks,” but that’s not enough for U.S. data analytics firm and military contractor Palantir, which is dropping all Android devices due to what it believes was an insufficient response from Google.
But that’s not all. Every week, we round up the security and privacy news we haven’t covered in depth. Click on the headlines to read the full stories. And stay safe.
A U.S. federal appeals court ruled last week that so-called geofencing warrants violate Fourth Amendment protections against unreasonable searches and seizures. Geofencing warrants allow law enforcement to demand that companies like Google turn over a list of all devices that appeared in a particular location at a particular time. The U.S. Court of Appeals for the Fifth Circuit ruled on August 9 that geofencing warrants are “categorically prohibited by the Fourth Amendment” because “they never include a specific user to be identified, only a temporal and geographic location where any given user may turn up post-search.” In other words, they are the unconstitutional fishing expeditions that privacy and civil liberties advocates have long claimed they are.
Google, which collects the location histories of tens of millions of U.S. residents and is the most frequent target of geofencing warrants, promised late last year that it was changing the way it stores location data such that geofencing warrants may no longer return the same data as before. From a legal standpoint, however, the issue is far from settled: The Fifth Circuit’s decision applies only to law enforcement activities in Louisiana, Mississippi, and Texas. Moreover, because of weak U.S. privacy laws, police can simply buy the data and skip the pesky warrant process altogether. As for the appellants in the case before the Fifth Circuit, well, they’re no better off: The court found that police used the geofencing warrant in “good faith” when it was issued in 2018, so they can still use the evidence it produced.
The Committee on Foreign Investment in the United States (CFIUS) fined German-owned T-Mobile a record $60 million this week for mishandling data during its integration with U.S.-based Sprint following the companies’ 2020 merger. According to CFIUS, “T-Mobile failed to take appropriate measures to prevent unauthorized access to certain sensitive data,” in violation of a National Security Agreement the company signed with the committee, which assesses the national security implications of foreign business dealings with U.S. companies. T-Mobile said in a statement that technical issues affected “information shared from a small number of law enforcement information requests.” While the company claims to have acted “promptly” and “in a timely manner,” CFIUS says T-Mobile “failed to report some incidents of unauthorized access promptly to CFIUS, which delayed the Committee’s efforts to investigate and mitigate any potential harm.”
The 12-year saga that is the prosecution of Kim Dotcom inched forward this week with New Zealand’s justice minister approving a U.S. request to extradite the controversial businessman. Dotcom created the Megaupload file-sharing service, which U.S. authorities say was used to infringe copyright on a massive scale. The U.S. seized Megaupload in 2012 and indicted Dotcom on charges related to organized crime, copyright infringement, and money laundering. Dotcom has denied any wrongdoing but lost an attempt to block extradition in 2017 and has been fighting it ever since. Despite the justice minister’s decision, Dotcom promised in a post on X to remain in the country, where he has legally resided since 2010. “I love New Zealand,” he wrote. “I am not leaving.”
The growing scourge of deepfake pornography (explicit images that digitally “undress” people without their consent) may have finally hit a major legal hurdle. San Francisco Chief Deputy City Attorney Yvonne Meré (and, by extension, the city of San Francisco) has filed a lawsuit against the 16 most popular “nudify” websites. These sites and apps allow people to create explicit deepfake images of virtually anyone, but children are increasingly using them to create sexual abuse material depicting their underage classmates. While several states have criminalized the creation and distribution of AI-generated child sexual abuse material, Meré’s lawsuit effectively seeks to shut down the sites altogether.