technologyreseller.co.uk 33

AI

...members understand they will be liable for treating AI terribly."

Huntingford's example of asking Copilot about salary details highlights the fear that many organisations have about AI and data security, although he suggests that this is more of an internal risk than an external one.

"The way I'm going to frame this is Microsoft's pitbulls, ninjas and barbed wire fences are bigger than anyone's in the world. If you are using Copilot on your ecosystem that is localised in your tenant, no one else can get that data. It doesn't leak out. That's not the problem. The problem is over-sharing inside your tenant. Let me give you an example. If I have a document that says 'salary raises' and I put it on my desktop and I share it with Johnny, Johnny will have access to that document. But say Johnny decides to share it with Rebecca and he clicks Share; all of a sudden, your entire company has got access to that data directly through Copilot. So security by obscurity does not exist any longer. You can't have a nested folder structure of 50 things and store your sketchy pictures in that bottom folder. Copilot acts as a loud hailer for terrible data. And what we're finding is that when organisations turn it on, they find this out very fast and quickly turn it off. It's got nothing to do with data leaking out. It's actually data being shared inwards that's the biggest problem."

Addressing the problem

So, what should organisations do to overcome this problem? "I'm going to give you three things that I ask organisations all the time. Number one, where is your data? Can you categorically tell me where it is? Number two, what data do you have? What data is stored in your ecosystem? Number three, who has access to your data? And most people I talk to cannot answer those three questions.

"When you use a visibility tool like Microsoft Purview, if you are legally allowed to, you can see everything: every Teams message I've sent to somebody, every document I've shared. You can then start running reports to see where data is, what data is available, and who has access to that data. That's how you can start looking at how to minimise risk. I call it ring fencing. How do I ring fence areas of that data? How do I restrict certain things and not others? That's the key. If you can get that right, you're on a good journey to using AI."

Customer concerns

Data security and data leakage are big concerns raised by customers. Another is fear over digital equity, or inequity: if one employee is given AI and another isn't and earns productivity bonuses as a result.

"The primary concern," says Huntingford, "is that AI is just a fad. I can remember this one time when Apple released this tiny device with a screen, and that was a fad. Now, all of a sudden, we live in them; we use these devices to make payments, to get onto airplanes, to manage our diaries. I feel this is where we are now with AI. We're in the Nokia flip phone era of AI. And the thing that's not valuable in AI is the thing that wasn't

continued...
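The "who has access to your data" check Huntingford describes can be sketched in code. The snippet below is a minimal, hypothetical sketch, not his method or a Purview feature: it assumes permission records shaped like the Microsoft Graph `permission` resource (with optional `link.scope` and `roles` fields) and flags the org-wide sharing links behind the Johnny-clicks-Share problem. The commented fetching step assumes a valid Graph access token and is untested.

```python
# Hypothetical sketch: flag file permissions that share beyond named users.
# Input dicts mimic the Microsoft Graph "permission" resource; anything with
# a sharing link scoped "organization" or "anonymous" is the over-sharing
# Huntingford warns about - one click and the whole tenant can read it.

def flag_oversharing(permissions):
    """Return the scope and roles of each broadly shared permission entry."""
    risky = []
    for perm in permissions:
        link = perm.get("link") or {}  # direct grants to users have no link
        if link.get("scope") in ("organization", "anonymous"):
            risky.append({"scope": link["scope"], "roles": perm.get("roles", [])})
    return risky

# Fetching real permissions might look roughly like this (untested sketch,
# assumes a Graph access token in `token`):
#   import requests
#   headers = {"Authorization": f"Bearer {token}"}
#   items = requests.get("https://graph.microsoft.com/v1.0/me/drive/root/children",
#                        headers=headers).json()["value"]
#   for item in items:
#       perms = requests.get(
#           f"https://graph.microsoft.com/v1.0/me/drive/items/{item['id']}/permissions",
#           headers=headers).json()["value"]
#       print(item["name"], flag_oversharing(perms))

if __name__ == "__main__":
    sample = [
        # shared directly with one person - fine
        {"roles": ["read"], "grantedToV2": {"user": {"displayName": "Johnny"}}},
        # an org-wide edit link - flagged
        {"roles": ["write"], "link": {"scope": "organization", "type": "edit"}},
    ]
    print(flag_oversharing(sample))
```

Ring fencing, in these terms, is deciding which items may legitimately carry an organisation-scoped link and stripping it from everything else before Copilot is switched on.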