Avoiding Common Pitfalls Using AI
In a recent survey, AI in project management was my readers’ top topic to explore further. AI’s benefits are often highlighted, so let’s look at potential pitfalls of using AI in project management. For an ironic twist, I asked an AI tool to identify the five most significant traps.
This is beyond best practice; it’s required practice! Any output from an AI tool needs human review for clarity, context verification, and refinement. To show this, you’ll see the AI tool’s answers along with my human interpretation and insights. The “My insight” entries either clarify what wasn’t clear in the AI response or extend what the AI identified to other scenarios. Overall, the AI response covered the high-level concerns I had thought of, but I didn’t consider it complete. Of course, I could have asked the AI to provide more detail on each high-level topic to see whether it surfaced some of the insights I identified.
- AI-identified pitfall: Overreliance on AI. Project managers may become too reliant on AI tools and overlook the importance of human decision-making. It’s important to remember that AI tools are only as good as the data they are trained on and the algorithms they use. They can’t replace human intuition and experience.
- My insight: AI also can’t “read a room.” Sure, there are great tools to take notes during meetings, track and assign action items, and check calendars. What they can’t do is tell whether a person is happy with an assignment or whether people are content with how the meeting went. They don’t detect body language (yet!), so they only tell part of the story.
- AI-identified pitfall: Data quality issues. AI tools require high-quality data to function effectively. If the data is incomplete, inaccurate, or biased, the AI tool’s output will be flawed. (My aside: Yep, garbage-in, garbage-out is still true.) Project managers need to ensure the data they’re using is high-quality and free from bias.
- My insight: In addition, you need to review the sources the AI tool uses to produce its responses. One way to minimize data-quality issues is to specify a required experience level, a persona, in your AI prompts. For example, start with “As a project manager with over 20 years of experience in delivering projects in the information technology sector…” and then ask for the information you’re looking for (see the short sketch after this list for one way to build such a prompt). This approach helps improve the relevance and accuracy of responses, but you still need to review the output.
- AI-identified pitfall: Lack of transparency. Some AI tools are “black boxes” that don’t offer any insight into how they arrived at their output. This lack of transparency can make it difficult for project managers to understand how the AI tool is making decisions and to identify potential issues.
- My insight: A simple rule: don’t use AI output without understanding the sources the tool referenced. Most AI tools now provide specific (and actual) references as part of their responses. Check those references to ensure their context is suitable for what you’re trying to do.
- AI-identified pitfall: Inadequate training. AI tools require significant training to function effectively. Project managers need to ensure that they and their team members are adequately trained in the use of the AI tool and that they understand its limitations.
- My insight: Using AI to produce something you don’t fully understand is another version of inadequate training. I heard a story about a web developer who used an AI tool to produce code in a language he had not learned. While the code initially appeared to work, errors soon arose. The developer had no idea how to debug it, and refined AI queries to correct the code were unsuccessful. His reputation with his client was tarnished because he didn’t appreciate the need for proper training.
- AI-identified pitfall: Ethical concerns. AI tools can raise ethical concerns, particularly around issues of privacy and bias. Project managers need to ensure that they are using AI tools in an ethical manner and that they are not inadvertently perpetuating biases or violating privacy laws.
- My insight: Do your homework! Understand your local laws; they are changing quickly around the world. The safest approach is to analyze the output from AI tools, verify its accuracy, and check how much of it simply “spits out” the work of others. Rework that material, or don’t use it without seeking permission from the referenced author. Another approach is to ground the output in your own stories and experiences. For example, if AI output recommends that you beef up project requirements, document that need by conveying your own stories where requirements were inadequate. That way, you use your own intellectual property, not someone else’s used without permission.
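If your team reaches AI tools through an API rather than a chat window, the persona technique from the data-quality pitfall above works the same way. Here is a minimal sketch, assuming the OpenAI Python SDK; the model name and the question are placeholders, not recommendations, and the output still needs the human review described throughout this article.

```python
# Minimal sketch of a persona-prefixed prompt, assuming the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in your environment

# Persona prefix: state the experience level you want the tool to answer from.
persona = (
    "As a project manager with over 20 years of experience delivering "
    "projects in the information technology sector, "
)
question = "what risks should I watch for when rolling out AI meeting-note tools?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model your organization approves
    messages=[{"role": "user", "content": persona + question}],
)

# The persona improves relevance, but a human still has to review the answer.
print(response.choices[0].message.content)
```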
Do you see other pitfalls to using AI in project management? Do you have rules for making AI work effectively in your own work? Share with us in the comments section.
For more about AI, check out Dave Birss’ How to Research and Write Using Generative AI Tools course. Or search LinkedIn Learning for his name to find all his courses.
_______________________________________
This article belongs to the Bonnie’s Project Pointers newsletter series, which has more than 62,000 subscribers. This newsletter is 100% written by a human (no aliens or AIs involved). If you like this article, you can subscribe to receive notifications when a new article posts.
Want to learn more about the topics I talk about in these newsletters? Watch my courses in the LinkedIn Learning Library and tune into my LinkedIn Office Hours live broadcasts.