Should AI Nudge You or Tell You What to Do?

Sep 17, 2025 6 Min Read
Why even accurate AI advice can have surprising costs

AI signals, such as alerts or recommendations, are increasingly used to help people make decisions. Across industries, AI monitors ongoing events, detects when a decision is critical and intervenes, sometimes with a subtle alert and other times with an explicit recommendation. In finance, fraud detection tools flag suspicious transactions for review. On factory floors, AI tools alert operators to potential defects. In IT, error alerts prompt engineers to check for bugs.

In general, AI guidance can fall into one of two categories: attention signals and action signals. Attention signals flag decisions that are important without offering a recommendation: “This is a critical decision: pay close attention.” Action signals go further and prescribe a specific action: “Here’s what you should do.”

In practice, both are widely used. In a hospital, for instance, an algorithm might alert a doctor that a patient’s vital signs are worsening with an attention signal that says, “Something’s wrong, take a look.” Or it might give an action signal, telling the doctor exactly what to do with a specific diagnosis and treatment recommendation.

But which type of signal actually helps us make better decisions, especially when the AI is reliable and provides highly accurate advice? This question is increasingly relevant, as AI tools become better calibrated and consistently dependable. As this trend continues, we must ask: Are there costs to relying on AI too much, even when its advice is correct? We explored these questions in a study using chess – a setting where AI recommendations are trusted, accuracy is exceptionally high and decision quality is easy to measure.

We ran a large-scale behavioural experiment with approximately 300 chess players from 55 countries. Participants ranged from amateur enthusiasts to elite professionals, 36 of whom held official chess master titles.


Players competed in full chess games under three conditions. In some games, they received action signals, where our AI tool revealed the best move in certain critical moments. In others, they received attention signals, which flagged those same moments as important but offered no move recommendation. And in some games, they played without any AI help.

We then analysed how the different signal types affected performance at the moment the signal was delivered, in the moves that followed and in overall game outcomes.

Unsurprisingly, action signals, which revealed the optimal move, helped players make the best move in the moment. But they came with a hidden cost: performance declined in the moves that followed. Players made quick decisions when the AI told them exactly what to do, but once they no longer had guidance, they struggled. In subsequent moves without AI input, they made more mistakes, took longer to decide and found it hard to regain their rhythm.

We call this the “uncharted waters” effect. Essentially, the AI’s guidance, while initially helpful, disrupted players’ cognitive flows, reducing their effort and leaving them less prepared for what came next.

Attention signals had a different effect. Rather than offering answers, they prompted players to slow down and think more carefully. This improved the immediate decisions – though not as much as action signals – but also led to stronger play in the moves that followed.

Both types of signals outperformed having no AI support at all, although they affected the decision-making process in fundamentally different ways. Action signals acted as a substitute for human judgement, whereas attention signals worked as a complement, encouraging deeper engagement and effortful thinking without overriding the players’ decisions.

“I got lazy and just trusted the (action) signals without calculating.” – Anonymous study participant


What does this mean for decision-making in business?

Action signals, even when highly accurate, can be risky. The problem isn’t the decisions people make when the AI gives advice – those often improve. The risk comes afterwards. For example, in financial services, an AI tool might correctly recommend rejecting a loan application based on a risk model. But over time, if a loan officer becomes used to relying on such recommendations without reflection, they may struggle when faced with similar applications without AI input – such as edge cases the AI model wasn’t trained to handle or situations where the system fails to flag a case.

In these moments, the officer may feel unprepared to reason through the decision on their own. Worse, they may stop engaging with the decision process altogether. Over time, this over-reliance can displace human judgement.

But action signals can be critical when speed matters more than reflection. In aviation safety systems, for example, pilots may receive an alert recommending a rapid course correction to avoid a mid-air collision. There isn’t time to weigh alternatives; the pilot must respond immediately. In these situations, acting fast is more important than thinking deeply.

Attention signals, by contrast, help people stay engaged and think more critically, especially when decisions unfold over time. Rather than telling people what to do, they prompt closer inspection. In retail operations, for instance, an AI dashboard might flag a sudden drop in sales for a particular product. It doesn’t suggest a specific action, but it draws the manager’s attention, prompting them to investigate, check inventory or look at competitor pricing before deciding how to respond. This preserves human judgement and allows the decision-maker to adapt to the situation.


One signal doesn’t fit all

When deciding whether to offer an AI signal, and what kind, managers need to weigh three key factors: the quality of the signal, the expertise of the user and the context in which the decision is made.

Start with the signal itself. Is it reliable? Action signals can be helpful when accurate, but they carry greater risk when based on incomplete or noisy data. Even in a domain like chess, where the AI’s recommendations were highly accurate, our study uncovered the “uncharted waters” effect: players performed worse in the moves after following action signals. In real-world business environments, where models are rarely perfect, that risk is even greater.

Next, think about the person using the system and their level of expertise. We found that expert players made far better use of attention signals than novices. For them, attention signals delivered over 70 percent of the benefit offered by action signals. Less experienced players, by contrast, benefitted less from minimal guidance and often needed more explicit support.

Finally, consider the environment. Is the person under time pressure to make a decision? Are they mentally depleted or unable to invest sufficient cognitive effort? Attention signals require time, focus and effort to be effective. When speed is critical or the decision-maker’s bandwidth is low, action signals may be more effective, as long as they’re reliable.

There’s no one-size-fits-all answer. The value of an AI signal depends not only on its accuracy, but also on when it’s used, who it’s guiding and whether it helps people think, or just makes it easier not to. As AI signals are increasingly used to support human decision-making, understanding these trade-offs can enable us to deploy them more safely and effectively.



Edited by: Katy Scott


Stefanos is a PhD Student in Decision Sciences at INSEAD.


Haosen is a data scientist at the Wharton AI & Analytics Initiative at the Wharton School, University of Pennsylvania.


Hamsa is an Associate Professor of Operations, Information, and Decisions at the Wharton School, University of Pennsylvania.


Osbert Bastani is an Associate Professor of Computer and Information Science at the University of Pennsylvania.

