
9. Prefer transparent and open generative AI

These AI models could be filled with ticking time bombs.


Summary

The biggest source of errors, bias, and legal risk in AI tools is their training data. It’s next to impossible for an organisation to assess the risk of using AI software whose training data set is undocumented or, worse, kept as a trade secret. Dependency on closed AI software is inherently riskier than dependency on other closed software, because it leaves you with fewer tools to validate grand claims from the vendor.
