Fly On Wall Street

“The commoditization of ‘AI’ and M365 Copilot is a veritable gold rush”: Microsoft MVP warns of ‘bullsh*t’ from inexperienced professionals

What you need to know

With Microsoft's recent push to integrate Copilot into nearly everything, especially Microsoft 365, a gold rush on AI integration is coming, with tech companies offering "expertise" to large corporations to assist with Copilot onboarding. On his blog, Loryan Strant explains the importance of vetting individuals or companies that may be offering Copilot consultation services with very limited hands-on experience in an enterprise environment.

Rising demand for AI could lead to deceptive sales and advertising as people jump in on the gold rush of AI

Strant makes some great points throughout his article. As a Microsoft MVP, Loryan likely has to deal with more intellectual BS from those pretending to be experts than the average Microsoft fan or end user does. AI is a booming industry, and rising demand is giving NVIDIA a huge boost, making it one of the most valuable chip manufacturers in the world.

If we weren't already sold on the importance of Copilot to Microsoft, nothing is more of a testament to how integral Copilot is to Microsoft's future than the recent escapades of Satya Nadella, who, in my opinion, single-handedly secured Sam Altman's return as CEO of OpenAI.

All of this creates an environment where fraud, deception, and, at the bare minimum, a little dishonesty will run rampant, as AI-focused startups and tech salespeople pretend to be experts while really just going "off the cuff" without any hands-on experience.

There’s a saying in IT (and I’m sure many other industries too) that as a consultant you only need to be 1 page ahead of the customer. I’ve always hated that saying, because it’s flat-out deception. In taking this stance, “professionals” are effectively comfortable with deceiving the customer about their level of knowledge – and in some cases this can lead to disastrous results such as data privacy breaches. And in the scenario where organisations are giving “AI” access to their content and information, it’s an incredibly dangerous thing to do.

Loryan Strant

I'm personally not acquainted with this idea that a consultant only needs to be one page ahead of the customer, but I have worked with enough supposed professionals and technical account managers (TAMs), people who are supposed to be experts on a given product, to know that a lot of the time they are just flying by the seat of their pants.

Strant continues, explaining that the high cost of Microsoft 365 Copilot for enterprise by itself means that a lot of people won't have gotten their hands on it yet. "Beyond the fact that the product has barely rolled off the shelves, is the fact that the product is damn expensive and carries a minimum purchase quantity. We're talking a minimum of 300 licenses – which amounts to USD ~109k."

In that short time, the product has reportedly changed multiple times, per Strant, so getting an actual expert on board would be important for corporations looking to integrate Copilot safely.

“What I am saying, is that if you are looking for a partner or consultant to help you with M365 Copilot in your organization – challenge them to prove their knowledge and experience. Challenge them to show unique value that goes over and above what is publicly available with a basic web search.”

Loryan Strant

Personally, I agree with Loryan that "right now, the commoditization of "AI" and M365 Copilot is a veritable gold rush." Nearly every technology and cybersecurity conference discusses AI. Microsoft Security Copilot might help security responders be more accurate, but at the same time, the power and speed of generative AI can help attackers write malware and craft better, more convincing social engineering attacks, with phishing, vishing, and now smishing attacks becoming more and more prevalent.

A lot of Chief Information Security Officers (CISOs) are hesitant to place their trust in a generative AI whose main purpose is learning and spreading information. For regulated industries like healthcare that deal with HIPAA, or for government contractors that handle classified information, giving an unproven technology like Copilot access to information, where a data breach could trigger huge fines from government regulators, is a bridge too far.

The onus is on Microsoft, Google, X, and other companies that are pushing these AI solutions for enterprise customers to prove that generative AIs can be safe, and that the data will be protected, even in a worst-case scenario.
