Press "Enter" to skip to content

Why OpenAI and Uber’s Latest Moves Raise Crucial Questions About AI and Labor

OpenAI has suspended access to its upcoming video generation tool, Sora, after a group of artists protested by leaking access to the tool. The artists said they were treated as “public relations puppets” and received minimal compensation for their contributions. Although hundreds of artists were granted early access to test Sora, roughly twenty of them said OpenAI had exploited their unpaid labor.

Uber Technologies has launched a new division, Scaled Solutions, focused on AI training and data-labeling services. The initiative connects businesses with independent contractors for the annotation work needed to train AI models, and it already serves notable clients such as Aurora Innovation and Niantic. Uber plans to onboard contractors globally, including in India, the United States, Canada, Poland, and Nicaragua. Workers will be paid per task, though concerns persist about fair pay, particularly in developing countries.

Why do we care?

There’s a problem when your beta testers leak your product. OpenAI’s reliance on artists for feedback, without adequate compensation or transparency, reveals a misstep in stakeholder engagement. IT companies must recognize the long-term value of ethical collaboration, particularly when working with creative or technical communities.

I also want to highlight how Uber is turning advanced technical work into gig work. For IT service providers, this move signals both an opportunity and a warning about the competitive landscape for labor-intensive AI support services.

IT service providers should take these developments as cues to invest in ethical frameworks, robust compliance practices, and differentiated services to sustain trust and competitiveness in a rapidly evolving market.