Top Ten Guide to Knowledge Curation in the Era of AI
Practical Guidelines for Curating Knowledge That Lives Outside the LLM
Knowledge curation has always required clarity, discipline, and persistence. AI brings speed, pattern recognition, and reach—but only if it’s paired with well-designed human practices. Here’s how to operationalize curation so that it produces visible, ongoing value.
1. Define Knowledge and Align with Strategic Objectives
Run a definition workshop: Bring together representatives from strategy, operations, and frontline teams to define what counts as “knowledge” in your context—data, documents, process maps, lessons learned, expert profiles, etc. Capture examples in a shared reference guide.
Prioritize with a relevance matrix: Create a two-axis grid—strategic impact vs. operational necessity. Place each knowledge asset category on the grid to identify what to curate first.
Use AI to map coverage gaps: Feed AI a sample of strategic objectives and relevant operational documents. Use it to suggest missing topics or under-documented processes.
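The relevance matrix in step 1 can be sketched as a simple scoring exercise. A minimal Python sketch, assuming illustrative asset categories scored 1-5 on each axis (all names and numbers here are hypothetical):

```python
# Hypothetical asset categories scored 1-5 on the two matrix axes.
assets = {
    "lessons learned": {"strategic_impact": 5, "operational_necessity": 3},
    "process maps": {"strategic_impact": 3, "operational_necessity": 5},
    "expert profiles": {"strategic_impact": 4, "operational_necessity": 2},
    "meeting notes": {"strategic_impact": 2, "operational_necessity": 2},
}

def quadrant(scores, threshold=3):
    """Place one asset category into a curation-priority quadrant."""
    hi_strategic = scores["strategic_impact"] >= threshold
    hi_operational = scores["operational_necessity"] >= threshold
    if hi_strategic and hi_operational:
        return "curate first"
    if hi_strategic or hi_operational:
        return "curate next"
    return "defer"

# List categories from highest to lowest combined score.
for name, scores in sorted(assets.items(),
                           key=lambda kv: -sum(kv[1].values())):
    print(f"{name}: {quadrant(scores)}")
```

The threshold and scale are starting points; the value of the exercise is in the workshop debate about where each category lands, not in the arithmetic.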
2. Design User-Centric Content for Relevance and Findability
Interview users, not just managers: Ask how they search for information, what terms they use, and where they expect to find things.
Develop a “findability checklist”: Before publishing, verify metadata accuracy, use of plain language, appropriate tagging, and cross-linking to related items.
Implement AI-assisted search tuning: Use AI analytics to monitor search queries with poor click-through rates, then adjust metadata or create new content to meet those needs.
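The search-tuning step above amounts to flagging queries that are searched often but rarely clicked. A minimal sketch, assuming hypothetical search-log records (the queries, counts, and thresholds are illustrative):

```python
# Hypothetical search-log records: query, impressions, clicks.
search_log = [
    {"query": "pto policy", "impressions": 120, "clicks": 90},
    {"query": "expense reimbursment", "impressions": 80, "clicks": 4},
    {"query": "onboarding checklist", "impressions": 60, "clicks": 45},
]

def low_ctr_queries(log, min_impressions=50, ctr_threshold=0.2):
    """Return frequently searched queries with poor click-through rates,
    i.e. candidates for new content or better metadata."""
    flagged = []
    for rec in log:
        if rec["impressions"] < min_impressions:
            continue  # too little data to judge
        ctr = rec["clicks"] / rec["impressions"]
        if ctr < ctr_threshold:
            flagged.append((rec["query"], round(ctr, 2)))
    return flagged

print(low_ctr_queries(search_log))
# flags "expense reimbursment" -- note the misspelling users actually type
```

A useful side effect: flagged queries surface the vocabulary users actually use (including misspellings), which feeds directly back into metadata and tagging.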
3. Establish Robust Content Quality and Governance
Assign asset ownership: Every item in your knowledge base should have an “owner” responsible for reviewing and updating it at least quarterly.
Set review dates and automate reminders: Use workflow tools (AI-assisted where possible) to flag content approaching review deadlines.
Score quality: Rate each asset on accuracy, completeness, clarity, and currency. Track scores over time and retire assets that consistently fail.
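The scoring-and-retirement rule above can be made concrete with a small sketch. The four dimensions come from the text; the 1-5 scale, the retirement threshold, and the "three consecutive failures" rule are illustrative assumptions:

```python
from statistics import mean

# The four quality dimensions named in the text, rated 1-5 (scale assumed).
DIMENSIONS = ("accuracy", "completeness", "clarity", "currency")

def quality_score(ratings):
    """Composite score: the mean of the four dimension ratings."""
    return mean(ratings[d] for d in DIMENSIONS)

def review_action(history, retire_threshold=2.5, fail_count=3):
    """Retire an asset whose last few scores all fall below threshold."""
    recent = history[-fail_count:]
    if len(recent) == fail_count and all(s < retire_threshold for s in recent):
        return "retire"
    return "keep"

asset = {"accuracy": 4, "completeness": 3, "clarity": 5, "currency": 2}
print(quality_score(asset))                 # 3.5
print(review_action([2.0, 2.2, 1.8]))       # retire: three low scores in a row
```

Tracking the score history, rather than a single snapshot, keeps one bad review cycle from retiring an otherwise healthy asset.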
4. Balance Explicit Codification with Tacit Knowledge Leveraging
Capture tacit knowledge through “micro-harvests”: After key meetings, record 5-minute summaries from subject matter experts, then transcribe and store them with metadata.
Build expert locators: Maintain searchable profiles with areas of expertise, recent projects, and preferred contact methods.
Use AI transcription + tagging: Let AI handle transcription, auto-tag recorded conversations, and link them to related explicit knowledge assets.
5. Foster a Culture of Knowledge Sharing and Trust
Integrate sharing into performance reviews: Track meaningful contributions to knowledge (and effective reuse of it) as part of annual evaluations.
Offer micro-recognition: Post “Knowledge Shoutouts” on community homepages for valuable contributions. You can also offer other forms of recognition, such as gift cards or points toward experiences, including attendance at industry events.
Run safe-to-share workshops: Practice handling sensitive or incomplete knowledge so people are comfortable contributing without fear of error.
6. Embed Knowledge Curation into Daily Workflows and Processes
Add capture points to workflows: In project templates, include a “knowledge assets created” field that must be completed at closure.
Link curation to change management: When a process changes, trigger an automated search for related documents and notify owners to update.
AI nudges: Use AI to prompt users to save, tag, or update content when it detects high-value interactions (e.g., solving a problem in chat).
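The change-management trigger above is essentially a lookup from processes to dependent documents and their owners. A minimal sketch, assuming a hypothetical catalogue (the process names, document IDs, and owners are all illustrative):

```python
# Hypothetical catalogue linking processes to dependent documents.
process_docs = {
    "expense approval": ["EXP-001", "EXP-014"],
    "onboarding": ["HR-003"],
}
doc_owners = {"EXP-001": "alice", "EXP-014": "bob", "HR-003": "carol"}

def on_process_change(process):
    """Return (doc_id, owner) pairs to notify when a process changes."""
    return [(doc, doc_owners[doc]) for doc in process_docs.get(process, [])]

print(on_process_change("expense approval"))
# [('EXP-001', 'alice'), ('EXP-014', 'bob')]
```

The hard part in practice is keeping the process-to-document mapping current; this is one place where AI-suggested links (based on content similarity) can supplement manual curation.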
7. Leverage Technology
Select tools after defining use cases: Example: “We need AI to summarize meeting recordings and extract action items,” rather than “We need the latest AI platform.”
Integrate into existing tools: Add AI-powered search to SharePoint or Slack instead of introducing another standalone tool.
Pilot, measure, scale: Run small experiments, measure impact on retrieval time or quality, and only then roll out more broadly.
8. Nurture Communities of Practice (CoPs) and Knowledge Intermediaries
Give CoPs a purpose charter: Define who they serve, what types of problems they solve, and how often they meet.
Assign knowledge stewards: These intermediaries help identify key insights from discussions, tag them, and link to related content in the knowledge base. They also support knowledge curation activities conducted by others.
AI-assisted summaries: After each community session, AI produces a topic summary, recommended reading list, and related experts to contact.
9. Measure and Demonstrate Value Continuously
Link metrics to business outcomes: Example: “Time to resolve customer issues dropped from 5 days to 3” rather than “We added 50 new documents.”
Core metrics:
% of content up-to-date (define “up-to-date” explicitly, e.g., reviewed or updated within a set window)
Average retrieval time
Active users as a percentage of the employee base
Cross-team reuse rates (tracked by asset IDs in different contexts)
Report quarterly: Share simple, visual reports showing KM’s contribution to speed, quality, revenue generation, and cost savings.
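The core metrics above are straightforward to compute once the underlying records exist. A minimal sketch, assuming hypothetical asset records and a 90-day freshness window (all IDs, dates, and team names are illustrative):

```python
from datetime import date

TODAY = date(2025, 6, 1)  # fixed for the example

# Hypothetical asset records; "up-to-date" here means reviewed
# within the last 90 days (choose your own window).
assets = [
    {"id": "A1", "last_reviewed": date(2025, 5, 10)},
    {"id": "A2", "last_reviewed": date(2024, 11, 1)},
    {"id": "A3", "last_reviewed": date(2025, 4, 2)},
]

def pct_up_to_date(assets, window_days=90):
    fresh = sum(1 for a in assets
                if (TODAY - a["last_reviewed"]).days <= window_days)
    return round(100 * fresh / len(assets), 1)

def cross_team_reuse_rate(usage):
    """usage maps asset ID -> set of teams that used it;
    reuse rate = share of assets used by more than one team."""
    reused = sum(1 for teams in usage.values() if len(teams) > 1)
    return round(100 * reused / len(usage), 1)

usage = {"A1": {"sales", "support"}, "A2": {"support"}, "A3": {"sales", "ops"}}
print(pct_up_to_date(assets), cross_team_reuse_rate(usage))
```

The point of the quarterly report is the trend line, not any single number: a reuse rate climbing quarter over quarter is a much stronger story than a one-off snapshot.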
10. Embrace Continuous Adaptation and Learning
Run quarterly knowledge audits: Identify emerging topics, under-used assets, and outdated content.
Test and learn: Pilot new curation methods (e.g., AI auto-tagging) with a small group, collect feedback, refine, then expand.
Encourage “after-action” learning: Immediately after major projects or events, capture what worked, what didn’t, and what should change—then feed it into the knowledge base.
Monitor language shifts: Use AI to flag when users start searching for new terms or synonyms so that the taxonomy evolves alongside the business.




Martin, thank you for the observation. I agree, but I wonder how you think about accountability rather than responsibility? Everyone may be responsible, but if errors occur, who is accountable? Is there any process for seeing what’s been updated? My issue with contact centers is that there is always too much information, but that information comes from somewhere: it was too much to create, and it becomes too much to maintain. From a process standpoint, I think we undervalue that knowledge and therefore understaff critical functions. That said, it is likely a place where AI will play a role as changes create patterns. AI could look at changes and suggest those that require review, for whatever reason. I’m interested in seeing how that plays out.
These are all terrific recommendations. One thing that has been a consistent struggle for my clients in the customer service contact center space is assigning an owner to each knowledge asset. When your knowledge base has hundreds (if not thousands) of articles, requiring each one to be owned isn't scalable. That's why I guide clients toward a Knowledge Centered Service approach, which empowers every knowledge user to act as an owner by either flagging or updating content in real time when they learn that something has changed.