There is no doubt that we live in a world of data. There are more channels than ever before that are either force feeding us information or collecting data from us. There is an incessant drive for data both in and out, and everyone is after analytical insights to inform their decision-making, sales pipelines, storytelling and more.
Volume of data seems to be taking precedence over the substance of the content and organisations are scrambling to make sense of the data, both big and small. So it makes sense that Artificial Intelligence would arrive on the scene as the white knight to help us slay the data dragons that we face.
But in the quest to find the most efficient way to combat these data dragons, is it possible that we are missing the larger plot?
Data is not just data, words are not just words, and sentiment is not just sentiment; they represent a person or people, whose 'data' carries context and nuance. That person's data is communicating something much more than what we might see at face value.
Have you not written to someone to say "have a nice day" where the tone and intent were very different, be it genuine, sarcastic, or even goading? Has the other person reading the same words "have a nice day" not picked up on that change?
There is a human element that AI does not have: the ability to feel and build inference based on an experience of life, a form of intelligence that cannot be scraped off the internet.
If two chefs working with the same ingredients can end up with two very different dishes, what is the 'x factor' these chefs bring to the table, and what are we outsourcing or excluding through our dive into all things AI?
There is little doubt that AI can save us time, and I am not suggesting AI has no role in the engagement process, but there are some real questions we need to ask, like: what ‘x-factor’ do we lose by handing over the responsibility for reading and understanding others to an AI program?
Of course, it is true that what makes us unique also makes us prone to bias, and feelings can certainly get in the way of understanding and decision-making. There is value in stepping back to ensure you are being objective when reactions and emotions are heightened. If that is advisable for surgeons and lawyers, then it is also fit for project managers, engagement professionals and facilitators.
If you have ever had the experience of navigating an 'automated' phone system or a 'bot'-driven help desk, you will know that even the most perfectly framed response from a bot can still leave you feeling somehow poorer for the experience. I think that is because we all crave human connection and can feel when the voice on the end of the line or, to some degree, the text on a page is human or AI generated.
So, on the one hand, I have the option of doing the work and taking responsibility for understanding others clearly, and evaluating how my own world view might get in the way of that; on the other hand, I have the option of outsourcing that responsibility to AI. At the moment, I am choosing the former. Why? Because I have not seen an AI program (and I have tested a few) that delivers something with the 'feel' of the original source content, and I like people too much to want to avoid them and their views.