In a controversial move, the United Kingdom has ordered Apple to create a backdoor that would allow government officials to access encrypted data stored in the cloud by users worldwide. This unprecedented demand, issued under the Investigatory Powers Act of 2016, requires Apple to provide blanket access to all encrypted content, which critics argue undermines the company's promise of privacy to its users. Rather than comply, Apple may stop offering encrypted storage in the UK, but withdrawal alone would not satisfy the order, which also covers users in other countries, including the United States. Security experts warn that any such backdoor could create significant vulnerabilities in global cybersecurity. The UK government's insistence on access comes amid growing concern that encryption is being used to shield criminal activity, while tech companies continue to advocate for user privacy and the right to secure communication. The situation raises fears that the UK's actions could prompt similar demands from other nations, ultimately endangering user privacy on a global scale.
In response to increasing cyber threats, including successful attacks attributed to Chinese hackers, Republicans are softening their criticism of the Cybersecurity and Infrastructure Security Agency, or CISA. Under the new administration of President Donald Trump, CISA is expected to focus on protecting critical infrastructure from hacking attacks. South Dakota Governor Kristi Noem emphasized the agency's essential role in combating ransomware and foreign threats. Despite past calls to dismantle CISA, support for its mission has strengthened, notably from House Homeland Security Committee Chair Mark Green, who highlighted the pressing need to enhance America's electronic defenses amid growing espionage concerns.
Thomson Reuters won a significant early victory in a copyright infringement lawsuit against Ross Intelligence, a legal AI startup, marking a pivotal moment in the ongoing legal debates surrounding artificial intelligence and copyright law. A US District Court judge ruled in favor of Thomson Reuters, finding that Ross's use of content from its Westlaw legal research platform constituted copyright infringement. The judge rejected Ross's "fair use" defense, in large part because Ross used the material to build a direct competitor to Westlaw. The case, closely watched because it could set precedents for similar lawsuits against major AI firms, highlights the complexities of training AI tools on copyrighted material. Ross, which ceased operations in 2021, argued that its AI was designed to extract legal answers directly from the law itself, but the court found it had substantially copied Thomson Reuters' content, including unique annotations and summaries written by legal experts.
Why do we care?
The UK is a significant market for Apple, so we're watching to see how the company responds.
In the US, there should be some comfort that CISA may not face the same level of political attack as other agencies. It's a fast-moving story.
The Thomson Reuters v. Ross Intelligence ruling is a big win for traditional copyright holders, setting an early legal precedent that AI models can't indiscriminately scrape and repurpose proprietary content. It raises serious questions for AI firms that have trained models on copyrighted datasets: is generative AI fundamentally at legal risk?