Managed.IT - issue 68

AI continued...

Huntingford's example of asking Copilot for salary details highlights the fear that many have about AI and data security, which he suggests is more of an internal risk than an external one.

"Microsoft's pitbulls, ninjas and barbed wire fences are bigger than anyone's, so if you are using Copilot on your ecosystem, localised in your tenant, no one else can get that data. That's not the problem. The problem is over-sharing inside your tenant. If I have a document that says 'salary raises' and I put it on my desktop and I share it with Johnny, Johnny will have access to that document. But say Johnny decides to share it with Rebecca and he clicks Share, all of a sudden your entire company has got access to that data directly through Copilot. Security by obscurity does not exist any longer. You can't have a nested folder structure of 50 things and store your secrets in that bottom folder. Copilot acts as a loud hailer for data, and when organisations turn it on they find this out very fast and many quickly turn it off. It's got nothing to do with data leaking out. It's data being shared inwards that's the biggest problem."

So, what should organisations do to overcome this? "There are three things I ask organisations all the time. Number one, where is your data? Can you categorically tell me where it is? Number two, what data do you have? What data is stored in your ecosystem? Number three, who has access to your data? Most people I talk to cannot answer those three questions.

"When you use a visibility tool like Microsoft Purview, if you are legally allowed to, you can see everything – every Teams message I've sent, every document I've shared. You can start running reports to see where the data is, what data is available, and who has access to that data. Then you can start looking at how to minimise risk. I call it ring-fencing. How do I ring-fence areas of that data? How do I restrict certain things and not others? If you can get that right, you're on a good journey to using AI."

Customer concerns

Data security and data leakage are big concerns raised by customers. Another is fear over digital equity or inequity, for example if one employee is given AI and another isn't and earns productivity bonuses as a result. However, the main obstacle to wider AI adoption, according to Huntingford, is suspicion that AI is just a fad.

"I can remember the time when Apple released a tiny device with a screen, and that was seen as a fad. Now we live in them; we use these devices to make payments, to get onto airplanes, to manage our diaries. Where we are with AI is akin to the Nokia flip phone era. It wasn't the phone that made iPhones popular but the apps, and it will be the same with AI. Right now, customers are still saying 'we don't see where this is going to be relevant to us'."

Huntingford points out that people tend to have very polarised responses to AI – "It's either amazing or it's going to come and take over the world" – adding that at this pivotal time in AI's development it's vital for all parties to understand its significance and importance.

"The other thing, on the people side, is finding talent. Because AI is so new, where do we find all the AI-ers? I have only been in AI for two years. Before that I was a Power Platform person and before that I was a Data and Dynamics person. Not one of these new CAIOs, Chief Artificial Intelligence Officers, has arrived with 20 years' experience. We are all still learning.
People need to be ready to learn, to find their niche and to recognise that AI is here to stay," he says.

What, then, is ANS doing to encourage this process and to attract AI talent into its ranks? "We're incredibly involved in the Microsoft community space – I'm at events twice a month,
