Microsoft has updated the terms of service going into effect at the end of September, clarifying that its Copilot AI services should not be used as a replacement for advice from actual humans.
AI-based agents are popping up across industries as chatbots are increasingly used for customer service calls, health and wellness applications, and even doling out legal advice. However, Microsoft is once again reminding its customers that its chatbots' responses should not be taken as gospel. "AI services are not designed, intended, or to be used as substitutes for professional advice," the updated Service Agreement reads.
The company specifically referred to its health bots as an example. The bots "are not designed or intended as substitutes for professional medical advice or for use in the diagnosis, cure, mitigation, prevention, or treatment of disease or other conditions," the new terms state. "Microsoft is not responsible for any decision you make based on information you receive from health bots."
The revised Service Agreement also details further AI practices that are explicitly no longer allowed. Users, for example, cannot use its AI services for extracting data. "Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services," the agreement reads. The company is also banning reverse-engineering attempts to uncover the models' weights, as well as using its data "to create, train, or improve (directly or indirectly) any other AI service."
"You may not use the AI services to discover any underlying components of the models, algorithms, and systems," the new terms read. "For example, you may not try to determine and remove the weights of models or extract any parts of the AI services from your device."
Microsoft has long been vocal about the potential dangers of generative AI's misuse. With these new terms of service, Microsoft appears to be staking out legal cover for itself as its AI products gain ubiquity.