
Microsoft fires those responsible for its AI ethics

14 March 2023


After all, when your terminator is about to hit the market, you don't need humans telling it what to do

Software King of the World Microsoft has shown that it has its priorities right -- while it introduces the world to its new AI-powered Bing, it has fired the people responsible for making sure that its AI does not take over the world.

Redmond no longer has a team to ensure that its AI principles are closely tied to product design, although to be fair it still has an active Office of Responsible AI, a department that creates rules and principles to govern its AI initiatives. How long it will be before the AI insists that this department is closed and sends terminators to purge it is anyone's guess.

Vole insists that it is continuing to invest in responsible AI, although the press release saying that might have been written by the AI itself (we can't be certain).

The AI Microsoft said:

"Microsoft is committed to developing AI products and experiences safely and responsibly, and does so by investing in people, processes, and partnerships that prioritize this. Over the past six years, we have increased the number of people across our product teams and within the Office of Responsible AI who, along with all of us at Microsoft, are accountable for ensuring we put our AI principles into practice. […] We appreciate the trailblazing work the Ethics & Society did to help us on our ongoing responsible AI journey."

According to Microsoft employees, however, the ethics and society team was essential in ensuring that the company's responsible AI principles were actually reflected in the design of the products it ships to the public.

"People would look at the principles coming out of the office of responsible AI and say, ‘I don’t know how this applies,'" one former employee says. "Our job was to show them and to create rules in areas where there were none."

 
