Scott Plewes
Chief Strategy Officer
What is Your AI’s EQ?
Artificial Intelligence (AI) can walk a strange line between people and technology. On the one hand, it is "just computing." On the other, its responses lead people to ascribe human qualities to it, both in assessment ("I don't trust it") and in behaviour ("it is friendly"). Because of this, you need to determine how people will assess your AI's emotional intelligence (EQ). Just as with people, the higher the AI's perceived EQ, the more likely people are to work with it successfully.
Other things to know:
- By EQ, we mean the degree to which the AI responds (or fails to respond) to human queries and direction in the way an empathetic person would.
- The more the interaction feels human-like (“chatting” with the AI, for instance), the more emphasis the human evaluator will put on the AI’s EQ.
- As with a person, the requirements depend on the AI's role in the organization. If it has frequent and diverse human contact, it will need a higher, or at least more adaptable, EQ than an AI serving a small, targeted group.
- Beyond testing whether an AI responds factually correctly, there's no reason you can't also test how people feel about individual or overall AI responses, and then adjust for that (a minimal sketch of this kind of tracking follows this list).
- Your AI’s “personality” will be one of your brand characteristics. You might as well be intentional about this more emotional aspect of your brand.
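To make that testing point concrete, here is a minimal sketch, in Python with only the standard library, of one way to record an evaluator's "how did this response feel?" rating next to the usual factual-correctness check so both can be tracked over time. The names (EvalRecord, summarize, the 1-to-5 sentiment scale) are hypothetical illustrations, not a specific tool or method from the article.

```python
# Hypothetical sketch: track perceived EQ alongside factual accuracy.
from dataclasses import dataclass
from statistics import mean


@dataclass
class EvalRecord:
    prompt: str
    response: str
    factually_correct: bool  # outcome of your existing accuracy check
    sentiment: int           # evaluator rating of how the response felt, 1 (cold) to 5 (empathetic)


def summarize(records: list[EvalRecord]) -> dict[str, float]:
    """Aggregate accuracy and perceived-EQ scores so both can be compared release to release."""
    return {
        "accuracy": mean(1.0 if r.factually_correct else 0.0 for r in records),
        "avg_sentiment": mean(r.sentiment for r in records),
        "low_eq_share": mean(1.0 if r.sentiment <= 2 else 0.0 for r in records),
    }


if __name__ == "__main__":
    sample = [
        EvalRecord("Where is my order?", "It shipped yesterday and arrives Friday.", True, 4),
        EvalRecord("I was double-charged.", "Refer to the billing policy.", True, 1),
    ]
    print(summarize(sample))
```

The second sample record is the interesting case: the response is factually correct but rated low on feel, which is exactly the gap an accuracy-only test would miss.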
Scott Plewes
Chief Strategy Officer
Over the past twenty-five years, Scott has worked in business strategy, product design, and product development in the high-tech sector, with a specialization in experience design. He has extensive cross-sector expertise and experience working with clients in complex, regulated industries such as aviation, telecom, health, and finance. His primary focus over the last several years has been product and service strategy and the integration of multi-disciplinary teams and methods.
Scott has a master's degree in Theoretical Physics from Queen's University.