<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:g-custom="http://base.google.com/cns/1.0" xmlns:media="http://search.yahoo.com/mrss/" version="2.0">
  <channel>
    <title>First Pass: A Collection of Insights on the world of AI and MSPs</title>
    <link>https://www.lemhi.ai</link>
    <description />
    <atom:link href="https://www.lemhi.ai/feed/rss2" type="application/rss+xml" rel="self" />
    <item>
      <title>Why We Wrote an Open Source AI Framework For the Industry</title>
      <link>https://www.lemhi.ai/open-source-ai-framework-msp</link>
      <description>One of the reasons AI feels so messy in the MSP world is simple. There isn’t a real framework, not a shared one, not a practical one, not something people can actually...</description>
      <content:encoded>
&lt;div data-rss-type="text"&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          One of the reasons AI feels so messy in the MSP world is simple. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          There isn’t a real framework.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Not a shared one. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Not a practical one. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Not something people can actually ground decisions in. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          What exists instead is a mix of vendor narratives, half‑borrowed security models, and a lot of well‑intentioned guesswork. Everyone is trying to build structure at the same time they’re trying to figure out what AI even 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          is
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           in their business. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          That’s a hard way to operate.
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Most of what I see today isn’t really a framework. It’s paperwork layered on top of uncertainty. An attempt to look organized before there’s anything stable underneath it. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          And that’s not a knock on effort. It’s just what happens when there’s nothing solid to anchor to. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          This is actually why I ended up writing the Lemhi AI framework at all. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Not because I wanted to introduce 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          another
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           abstraction, but because there wasn’t one to start from. There was no common language. No baseline for what “good” even looked like. No way to evaluate tools without starting from scratch every time. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Everyone was picking tools first and trying to justify them later. That’s backwards. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Without a framework, every AI decision feels heavyweight. Every new tool creates debate. Every customer conversation turns into a custom explanation. And every internal discussion becomes philosophical instead of practical. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          A real framework does the opposite.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          It gives you a place to stand. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          It makes tradeoffs obvious. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          It lets you evaluate tools 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          against
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           something instead of reacting to them emotionally or defensively. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Once we accepted that a framework was needed, the next decision was obvious. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          It had to be open.
         &#xD;
    &lt;/strong&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          If this lived behind a product, a paywall, or a consulting engagement, it would immediately lose credibility. It would feel like positioning instead of structure. Another opinionated take instead of a shared starting point. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          That was the opposite of the goal. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          The intent here is not to “win” the AI framework debate. It is to start it and open it to the community. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Open source forces discipline. Anyone can inspect it. Anyone can challenge it. Anyone can fork it. If something does not hold up in the real world, it gets exposed quickly. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          That is a feature, not a risk.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          It also keeps the framework honest. The moment it turns into a sales asset, it stops being useful as a control system. MSPs already have enough vendor-shaped narratives telling them how AI should work. They do not need another one. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          So we gave it to the community and let the community own the changes. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          My take on it? 
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
          You do not need to believe everything in it. You just need a place to stand. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          If you have spent time in cybersecurity, the structure will feel familiar. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          That is intentional.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          CIS works not because it is perfect, but because it respects how organizations actually adopt things. It recognizes that maturity is staged. That not every control matters on day one. That sequencing matters more than ambition. AI adoption follows the same pattern. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          There is a massive difference between “we are experimenting” and “this is now part of how work gets done.” Treating those two states the same is how organizations either freeze or move too fast. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          So instead of inventing something new, we copied the part that already worked. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          What can you expect? 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;h3&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Implementation Groups. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/h3&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          IG1: Baseline – What must exist before AI is considered real 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          IG2: Scale – What prevents drift as adoption grows 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          IG3: Advanced – What only matters once AI is embedded into sensitive workflows 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          This is not about slowing teams down. It is about giving them permission to start honestly where they are. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;h3&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Pillars 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/h3&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Pillars are not just organizational categories. Each one maps to a failure mode we kept seeing in real environments. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Most AI problems are predictable. Missing ownership. Unclear data boundaries. No visibility. No rollback path. Pillars force teams to confront the parts they usually assume away. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;h3&gt;&#xD;
    &lt;span&gt;&#xD;
      
          What Each Pillar Represents 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/h3&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Each pillar answers a different “what breaks if we ignore this” question: 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Strategy &amp;amp; Buy‑In
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – Who owns AI and why it exists 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Policy &amp;amp; Governance
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – What is allowed, what is not, and how exceptions work 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Technical Readiness
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – Whether the environment can actually support AI 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Process Mapping
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – Where AI fits into real work, not demos 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Data Security &amp;amp; Tagging
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – What data AI can see and what it never should 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          AI Observability
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – Whether usage, cost, risk, and quality are visible 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Copilot Readiness
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – How Microsoft Copilot expands safely and deliberately 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          AI Tooling &amp;amp; Deployment
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           – How pilots become production without chaos 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Skipping a pillar usually shows up later as noise, risk, or rework. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;h3&gt;&#xD;
    &lt;span&gt;&#xD;
      
          What’s Inside Each Control 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/h3&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Every control is written to be executable, not theoretical. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Each one includes: 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          A clear objective 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          A concrete requirement 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          A defined cadence 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          A named owner 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Evidence you can actually produce 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
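  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          As a rough sketch, a single control entry could look something like the following. The control ID, field names, and values here are illustrative only, not the framework’s exact schema: 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;pre&gt;control: AI-OBS-01 (hypothetical ID)&#xD;
objective: Make AI usage visible across the organization&#xD;
requirement: Maintain an inventory of every AI tool in use, who uses it, and for what&#xD;
cadence: Reviewed quarterly&#xD;
owner: Service Delivery Manager&#xD;
evidence: Exported usage report from the most recent review&lt;/pre&gt;&#xD;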
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Controls are not pass or fail judgments. They are orientation points. They tell you what matters now, what can wait, and what you should not skip. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          The point of the framework is simple. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          AI should feel boring once it is working. Owned. Governed. Measured. Improved over time. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;br/&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          If it does not, something upstream is missing. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;br/&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
&lt;/div&gt;</content:encoded>
      <enclosure url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Topo-Patterns_Dark-Forest+Ink.jpg" length="169378" type="image/jpeg" />
      <pubDate>Wed, 25 Mar 2026 00:15:59 GMT</pubDate>
      <guid>https://www.lemhi.ai/open-source-ai-framework-msp</guid>
      <g-custom:tags type="string" />
      <media:content medium="image" url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Blog+Thumbnail+Concept+1+%283%29.png">
        <media:description>thumbnail</media:description>
      </media:content>
      <media:content medium="image" url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Topo-Patterns_Dark-Forest+Ink.jpg">
        <media:description>main image</media:description>
      </media:content>
    </item>
    <item>
      <title>Most MSP Problems Aren’t Technical, and AI is the Least of IT All (In your customer’s mind...)</title>
      <link>https://www.lemhi.ai/msp-problems-ai</link>
      <description>The more conversations I have with MSPs about monetizing AI, the less convinced I am that their biggest problems are technical. They FEEL technical...</description>
      <content:encoded>
&lt;div data-rss-type="text"&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          The more conversations I have with MSPs about monetizing AI, the less convinced I am that their biggest problems are technical. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          They 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          feel
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           technical. They show up as tool debates, platform decisions, AI comparisons, and architecture questions. What bothers me is that everyone treats this AI problem like an engineering problem that just needs better tools. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          But that’s not actually where things break down. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          What I hear most often isn’t “this tool doesn’t work.”
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           It’s “we’re not sure what to use.” 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           Or “we’re still testing a few things.” 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           Or “we don’t really know how to talk about this with customers yet.” 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          That last part matters more than people want to admit.
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          A lot of this came up in recent coffee chats. Someone will say they’re looking at Copilot, but also using ChatGPT, and then another AI product their vendor just showed them. They’re trying to decide which one to standardize on, whether they should offer multiple options, or whether they should even be selling AI at all yet. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          None of that is a technical limitation.
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
           It’s a clarity problem.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          MSPs don’t lack tools. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          They lack conviction.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Every option sounds plausible, and in a lot of cases they ARE plausible. Every vendor has a story. Every demo works in isolation. And because everything 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          might
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           be important, nothing gets fully committed to. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          So teams keep evaluating. They keep experimenting. They keep waiting for the moment when it all becomes obvious. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          And in the meantime, selling AI feels hard.
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          That is absolutely not because customers don’t want it. It’s because MSPs don’t know how to explain it without talking tools. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          That’s the part I think a lot of people miss.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           Selling AI isn’t hard because the technology is complex. It’s hard because the narrative is unsettled. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          If you’re not clear on what AI 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          is
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           in your stack, what problem it actually solves, and where its limits are, then every sales conversation turns into a ramble. You hedge. You over‑qualify. You list tools instead of outcomes. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Customers feel that. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;span&gt;&#xD;
        
           And when the story isn’t clear, trust doesn’t form. Deals stall. AI gets positioned as “interesting” instead of “necessary,” or customers go their own way and solve their own problems. AI isn’t a particularly difficult problem to self-serve, so that’s the path of least resistance. 
          &#xD;
      &lt;/span&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          That’s not a sales failure. That’s a prioritization failure. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Most MSPs don’t need better pitch decks or smarter demos. They need stronger filters. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          They need to decide what they believe. They need a default answer. They need to say no to a lot of things so the yes actually means something. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          That’s where ecosystems start to matter.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          An ecosystem reduces choice. It forces consistency. It gives your team shared language instead of a dozen different explanations depending on which tool someone last tried. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          When the internal story stabilizes, selling gets easier. Not slicker, just clearer. Most MSP problems aren’t technical. And most MSP AI sales problems aren’t either. 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          They’re narrative problems. They’re clarity problems. They’re commitment problems.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          AI didn’t create that. It just exposed it. And until that’s addressed, no new tool is going to make AI easier to sell. It’ll just add another option to an already crowded list. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          This is where my skepticism keeps leading me.
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          The real question isn’t 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          how do we sell AI? 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          It’s 
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
          what are we actually willing to stand behind?
         &#xD;
    &lt;/span&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Once that’s clear, the rest starts to quiet down. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
&lt;/div&gt;</content:encoded>
      <enclosure url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Topo-Patterns_Dark-Forest+Ink.jpg" length="169378" type="image/jpeg" />
      <pubDate>Wed, 18 Mar 2026 00:16:00 GMT</pubDate>
      <guid>https://www.lemhi.ai/msp-problems-ai</guid>
      <g-custom:tags type="string" />
      <media:content medium="image" url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Blog+Thumbnail+Concept+1+%281%29.png">
        <media:description>thumbnail</media:description>
      </media:content>
      <media:content medium="image" url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Topo-Patterns_Dark-Forest+Ink.jpg">
        <media:description>main image</media:description>
      </media:content>
    </item>
    <item>
      <title>In an AI World Full of Noise, I’m Being Skeptical on Purpose</title>
      <link>https://www.lemhi.ai/ai-noise-skeptic</link>
      <description>I’m not anti‑innovation. I build things for a living. I like new ideas. I like progress. I like when technology actually moves the ball forward. What I’m against is noise...</description>
      <content:encoded>&lt;div data-rss-type="text"&gt;&#xD;
  &lt;p&gt;&#xD;
    
         The new season is a great reason to make and keep resolutions. Whether it’s eating right or cleaning out the garage, here are some tips for making and keeping resolutions.
        &#xD;
  &lt;/p&gt;&#xD;
&lt;/div&gt;&#xD;
&lt;div data-rss-type="text"&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          I’ve been labeled a bit of a skeptic around AI lately. I’m fine with that. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          But it’s worth saying what that skepticism actually is (and what it isn’t). 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          I’m not anti‑innovation. I build things for a living. I like new ideas. I like progress. I like when technology actually moves the ball forward. What I’m against is noise. And right now, MSPs are drowning in it. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Every cycle brings a new framework, a new model, a new set of tools, a new abstraction layer that promises to “change everything.” The language is confident. The diagrams are clean. The demos are impressive. And yet, when you step back, a lot of it doesn’t survive contact with reality. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          That’s where my skepticism comes from. I’ve been in this space for 15+ years, and I know what the beginning of a cycle looks like. It follows the same patterns and the same experimentation. And for a decade and a half, whether cloud, cyber, or AI, it has mostly landed in the same ending position: repeatable and monetizable at scale. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;span&gt;&#xD;
        
           I don’t start by believing vendors. I don’t start by assuming the abstraction is necessary. I don’t start by trusting that because something is popular, it’s useful. I start by asking a much more boring question:
          &#xD;
      &lt;/span&gt;&#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          Does this actually hold up when you try to run it, scale it, support it, and charge money for it? 
         &#xD;
    &lt;/strong&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Most things don’t fail because they’re bad ideas. They fail because they’re fragile, expensive, hard to explain, or impossible to operationalize. Or they only work under perfect conditions that never exist outside a demo. And in today’s era, at the breakneck speed things get done, none of that is acceptable. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          I think a lot of people miss this. Especially in AI. 
         &#xD;
    &lt;/strong&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          There’s a temptation to treat intelligence as magic instead of infrastructure. To stack more layers, more orchestration, more cleverness on top, and assume value will appear. But intelligence that can’t be repeated, governed, or monetized isn’t progress. It’s a science project, and it’s irresponsible if you’re running it in your customers’ environments. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          Healthy skepticism is how you protect yourself from that. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          It forces you to slow down and separate what’s interesting from what’s durable. What sounds smart from what actually compounds. What helps one team ship a demo from what helps an organization operate at scale. And it’s the exercise you need to stop feeling so overwhelmed by the noise. 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          This is where my head goes with it:
         &#xD;
    &lt;/strong&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;span&gt;&#xD;
        
           if something can’t be explained simply, deployed repeatably, and improved incrementally, it’s probably not ready. 
          &#xD;
      &lt;/span&gt;&#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      &lt;span&gt;&#xD;
        
           So yes, I’m skeptical. On purpose. Because skepticism is how you calm the noise.
          &#xD;
      &lt;/span&gt;&#xD;
    &lt;/span&gt;&#xD;
    &lt;strong&gt;&#xD;
      
          And once the noise is gone, the real work can start. 
         &#xD;
    &lt;/strong&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
           
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
  &lt;p&gt;&#xD;
    &lt;span&gt;&#xD;
      
          I look forward to sharing more of my learnings soon! (They’ll be a little less skeptical, I promise.) 
         &#xD;
    &lt;/span&gt;&#xD;
  &lt;/p&gt;&#xD;
&lt;/div&gt;</content:encoded>
      <enclosure url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Topo-Patterns_Dark-Forest+Ink.jpg" length="169378" type="image/jpeg" />
      <pubDate>Wed, 11 Mar 2026 00:16:00 GMT</pubDate>
      <guid>https://www.lemhi.ai/ai-noise-skeptic</guid>
      <g-custom:tags type="string" />
      <media:content medium="image" url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Blog+Thumbnail+Concept+1.png">
        <media:description>thumbnail</media:description>
      </media:content>
      <media:content medium="image" url="https://irp.cdn-website.com/929fe5ec/dms3rep/multi/Topo-Patterns_Dark-Forest+Ink.jpg">
        <media:description>main image</media:description>
      </media:content>
    </item>
  </channel>
</rss>
