1072857049669742592

In 2016, Nature asked 1,500 scientists across multiple disciplines: “Is there a reproducibility crisis?”. The answer was “yes” (https://t.co/o4UmEx2xmf). That answer applies to #AI as well because there’s not enough transparency. That’s the problem, not shadow AI.

1072852294872547328

And you thought it was just about manipulating any face to say anything we want > “in a survey performed with workers from Amazon’s Mechanical Turk, 43% thought that the #AI-generated objects were real” Google’s AI realistically inserts objects into images https://t.co/QBGQpbDknw

1072850466982244353

Nobody challenges their TCO spreadsheets. I learned that after thousands of interactions with F500/GF2000 customers over the last 10 years. Also, the very limited adoption of capacity mgmt/optimization solutions in the last decade is proof that people underestimate that complexity. https://t.co/kVkFWAEYRS

1072844409442615296

The market shortage of #AI talent doesn’t just mean that it’s difficult to build the smart applications you want to build. It also means that there’s no way for you to know if the data scientists you already have are doing their job correctly/efficiently/scrupulously or not

1072840634912907264

Reading a Gartner report about “shadow #AI”, a term used to describe the scenario where users bring their own data to build their own AI models. The term seems to suggest nefariously biased algorithms. But “sanctioned AI” isn’t any less biased. Ask startups where they get their datasets