The Plastic Banana Slicers of AI Tech
Have you heard of ChatGPT wrappers? No, they’re not chart-topping AI hip hop artists, but rather a pejorative term for a crop of AI tools that have ‘wrapped’ a large language model, like GPT, in a user interface that makes it look spiffier than it really is. In the weeks following ChatGPT’s initial launch in November 2022, hundreds if not thousands of these companies popped up seemingly overnight, and many have gone under in the months since.
For years, creating the front end of software, the user interface you and I click and tap, has been getting easier with templated and no-code software development platforms that allow even people with limited technical ability to build websites and apps. The tricky thing that remained was how to make the app do something interesting or useful.
Enter GPT. The incredible flexibility of OpenAI’s game-changing AI model could instantly animate an app, imbuing it with all kinds of abilities that would ordinarily have taken a team of engineers weeks if not months, and hundreds of thousands of dollars, to build. Folks with a bit of know-how could suddenly whip up a decent-looking, functional app over a weekend, and thousands did just that.
Alas, even with powerful AI, there is still no free lunch. The hidden consequence of such frictionless software development was that it allowed many teams to bypass the normal process of learning about the problems they were trying to solve. Before GPT, even the eagerest beaver couldn’t deploy a software solution to a problem they had just learned about yesterday. They had to spend time hacking together the solution, and in the process they learned how to solve the problem in an elegant way. GPT removed that crucial barrier of time, enabling brash builders to launch all kinds of products with only the shallowest grasp of the problems they intended to solve.
If you’ve signed up to use one of these tools (assuming it has solid data security and reputable provenance), it’s not the end of the world, but allow me to encourage you to reconsider. AI wrappers are like single-function kitchen utensils. For example, I present to you: the plastic banana slicer!
Where do I begin… Well, let’s start with the fact that this thing probably can’t even slice all bananas, like those that don’t conform to its idealized banana shape. Instead of applying force via a single blade, this contraption distributes pressure across 17 cutting surfaces, encouraging a smushing rather than a slicing effect. And how annoying must this thing be to clean between the slats? Lastly, of all the foods human beings eat in this world, the banana may actually be the single easiest to cut. When did chopping bananas become a problem at all, and what the #$@ is wrong with using a knife? Rant concluded.
I should reel myself in before I go off on the garlic press. Yup, thems fightin’ words. The point is that a knife, or even a sharpened stone, has always been able to do what the banana slicer proclaims to have revolutionized, along with much, much more. These contraptions are really just more foundational, useful tools encumbered with a plastic edifice that limits them to a single obvious use.
Similarly, the rapid-fire AI tools that mushroomed up are essentially the plastic banana slicers of AI tech. They may help you do that one thing, but is that worth your money, and worth forgoing the more versatile tools at your disposal?
Now, take the mandolin, a cutting instrument that lets the user move the food instead of the knife to achieve rapid, uniform cuts of precise thickness. A mandolin is not just another gimmicky, less functional knife; it’s a blade that has been adapted to give cooks augmented abilities. There are AI ‘mandolins’ out there: startups thoughtfully applying AI to truly augment the human ability to perform not only discrete tasks, but entire jobs, more efficiently and effectively.
How do you tell a banana slicer from a mandolin? A few questions to ask of any AI tool:

- Does the tool solve a broad range of problems or just one specific task? AI wrappers often focus on a single, narrow function, while useful AI tools offer flexibility across many tasks.
- Is the tool designed with a deep understanding of the problem it aims to solve?
- Can the tool adapt to new or unexpected challenges?
- Does using the tool contribute to long-term skill development and problem-solving abilities?
- Is the AI applied in a thoughtful way that genuinely augments human capabilities?
I do love when I stumble upon a nice culinary analogy for concepts in technology. At the end of the day, will I judge you a bit if I see a banana slicer in your kitchen drawer? Yes, yes I will. But we will absolutely remain friends, and what really matters is how the dish you’re making delights and nourishes us with those 17 mushy slices of banana.
Likewise, most junk AI software built too quickly for its own good is not malicious or dangerous, but it can be wasteful and may inhibit a deeper understanding of generative AI and its capabilities. I don’t have to tell you nonprofit pros that our mission-driven line of work is no joke, that it’s often like navigating unforgiving wilderness. Well then, would you rather be out there armed with a sharp knife, or may I interest you in a pair of pizza scissors?
Thanks for reading this edition of The Process, a free weekly newsletter and companion podcast discussing the future of technology and philanthropic work. Please subscribe for free, share, and comment.
Philip Deng is the CEO of Grantable, a company building AI-powered grant writing software to empower mission-driven organizations to access grant funding they deserve.