
Do You Really Need Your Own AI Model?

- by Andi Smith

With advancements in AI moving so fast, is building your own model worth the investment?

We've seen a real change in AI over the last year or so. OpenAI's release of GPT-4 really changed the AI landscape; Anthropic's Claude 3.5 Sonnet, released in June, is equally impressive; and Google's Gemini is not far behind.

The speed at which these companies are moving forward and upgrading their AI offerings is staggering. New releases arrive frequently, bringing substantial features such as OpenAI's Assistant Threads and Claude's Projects and Artifacts.

Whilst building your own AI model seemed like a viable option just two years ago, it is now difficult to recommend. Any development will quickly become out of date without serious ongoing investment, and the returns on that investment will be hard to achieve unless your moat is wide and your use case is genuinely unique.

Supplementing knowledge

Instead, techniques such as Retrieval-Augmented Generation (RAG) seem a more efficient approach. RAG supplements the AI model with additional information, drawn from structured documents or a vector database, so the model is better informed about the context in which the user is interacting with it. For example, if your AI was providing career advice, you might use RAG to supply the user's work history, some information about the companies they have worked for, and details of the job role they are applying for.
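
To make that career-advice example concrete, here is a minimal sketch of the retrieval step in Python. It assumes the OpenAI Python SDK (with an OPENAI_API_KEY in the environment) for embeddings and a small in-memory document store; the document contents, model name and helper names are illustrative rather than a definitive implementation.

```python
# Minimal RAG retrieval sketch (assumes the OpenAI Python SDK and an
# OPENAI_API_KEY in the environment; documents and names are illustrative).
from openai import OpenAI

client = OpenAI()

# The user's own context: work history, company notes, job description.
documents = [
    "Work history: six years as a front-end engineer at a mid-size retailer.",
    "Company notes: the retailer sells home furnishings online across Europe.",
    "Job role: Senior Product Engineer, owning checkout and payments.",
]

def embed(texts):
    """Convert text into vectors so relevance can be compared numerically."""
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in response.data]

def cosine(a, b):
    """Similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

def retrieve(question, k=2):
    """Return the k documents most relevant to the user's question."""
    doc_vectors = embed(documents)
    query_vector = embed([question])[0]
    scored = sorted(zip(doc_vectors, documents),
                    key=lambda pair: cosine(query_vector, pair[0]),
                    reverse=True)
    return [doc for _, doc in scored[:k]]
```

In a real product the documents would live in a proper vector database rather than a Python list, but the shape of the idea is the same: fetch the most relevant pieces of your own data before the model ever sees the question.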

With these techniques, you get the best of both worlds - all the updates and model improvements from the heavy hitters, combined with the context from your own business - and at a fraction of the cost.
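
The "wrapper" half is equally small. Building on the retrieve() helper in the sketch above, and again assuming the OpenAI Python SDK (the gpt-4o model name and prompt wording are just placeholders), the hosted model does the heavy lifting while your retrieved context does the differentiating:

```python
# Sketch of the wrapper half: pass the retrieved context to a hosted model.
# Reuses client and retrieve() from the previous sketch; model name is illustrative.
def career_advice(question):
    context = "\n".join(retrieve(question))
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You are a careers adviser. Use the context below.\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(career_advice("How should I position my experience for this role?"))
```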

When should we build our own model?

There are of course times when building your own model is still worth it. You should consider it when:

  • You have highly specialised domain data that existing models do not understand
  • You need complete control over the model's behaviour
  • You have strict privacy requirements

Removing the stigma of wrappers

Over the last 12 months, a bit of a stigma has been attached to AI wrappers: a feeling that if you are not using your own model, then what is the moat of your business? But I feel this stigma is unfair.

If we think about a lot of successful products, quite often what we find is that users are looking for convenience. If you can cut down the time it takes a user to complete a task (and you market it well), then they really don't care how you do it.

At this point in time, combining AI wrappers with Retrieval-Augmented Generation feels like a very valid way to build a successful business: it doesn't require deep machine learning expertise and has a much shorter time to market. The challenge, from an entrepreneurial point of view, is to find a pain point that people would pay to fix, and then to find a way to distribute and market your application.

And that, dear reader, is up to you.