OpenAI’s board meetings were set to face an unusual situation, with two major partners, Microsoft and Apple, sitting in as observers. That scenario has now been avoided: both companies have decided against taking observer roles. Industry experts argue this is the right outcome, since having such partners observe the board raises potential conflicts of interest.
Microsoft, OpenAI’s largest partner and investor, had held an observer seat since last year’s leadership crisis, when OpenAI unexpectedly dismissed and then reinstated CEO Sam Altman. More recently, reports suggested that Apple, which plans to integrate ChatGPT into iOS, would also join as an observer.
Today, it was announced that Microsoft has ended its observer role effective immediately, and Apple will not take up the role either. Microsoft communicated to OpenAI in a letter that its seven months of observing meetings provided valuable insights without compromising the board’s independence. Microsoft expressed confidence in OpenAI’s direction and noted that the observer role was no longer necessary.
Instead, OpenAI will now hold regular meetings with strategic partners and investors. This approach is considered more appropriate for several reasons.
Regulatory Scrutiny
The first reason is regulatory. Stepping away from board observer roles helps Microsoft and Apple avoid antitrust scrutiny. European antitrust officials had earlier dropped an investigation into Microsoft’s relationship with OpenAI, concluding that Microsoft had not taken control of its partner.
However, a broader EU review is now underway, examining strategic investments and partnerships across the AI industry, and regulators in the U.K. and U.S. are closely monitoring these relationships as well. Withdrawing from the boardroom removes one obvious target for those inquiries into fair competition.
Concerns from Industry Experts
The second reason comes from corporate governance. Industry experts have raised concerns about major customers sitting in on board meetings. A retired chair of a publicly traded company noted that having customers or large shareholders as observers could hinder candid, serious, and confidential discussions, and OpenAI’s board inevitably handles sensitive information.
Open-ended observation is where the trouble lies; defined input and discussion during specific meetings are far less problematic. The concern only grows when the observers are Microsoft and Apple, two companies that compete with each other, sitting in the same room.
Elon Musk’s xAI Moves to Build Own Data Centers
In related AI news, Elon Musk’s xAI startup has decided to build its own data center systems, ending its server rental agreement with Oracle. Musk confirmed the decision on X (formerly Twitter), saying xAI must control its own infrastructure to move faster than established rivals.
U.S. Accuses Russia of AI-Powered Disinformation Campaign
In another significant development, the U.S. has, for the first time, accused a foreign country of using generative AI in an influence campaign. According to Reuters, Russia is alleged to have employed an AI-powered bot farm to spread disinformation in the U.S. and other countries.
These developments highlight the dynamic and sometimes contentious nature of the AI industry, where strategic decisions and international actions continue to shape the landscape.
Looking Forward
OpenAI’s new approach to holding regular meetings with strategic partners, rather than having them as board observers, seems more sensible. This method ensures that partners can still offer valuable input without compromising the board’s independence. It also helps address regulatory concerns, particularly with ongoing scrutiny from European, U.K., and U.S. regulators who are keen on ensuring that no single partner exerts too much influence over the company.
In another strategic move, Elon Musk’s xAI has decided to build its own data centers rather than continue renting servers from Oracle. The decision underscores how much control and speed matter in the AI market: owning its data centers lets xAI iterate more rapidly and efficiently. Lastly, the U.S. accusation that Russia used AI for disinformation campaigns highlights the need for robust international rules to prevent the misuse of AI technology.