How ‘Intelligent’ NFTs Raise Some Red Flags

AI NFTs will soon flood the market, but the vast majority are not going to succeed.

Dec 10, 2021 at 10:18 p.m. UTC
Updated May 11, 2023 at 6:06 p.m. UTC

NFTs are on a clear collision course with artificial intelligence, which will in theory unlock all sorts of new utilities, including AI-driven avatars that can hold conversations.

However, because few people understand how this type of AI works, it’s alarmingly easy to hoodwink buyers. Recently, a handful of so-called smart or intelligent non-fungible token projects have begun minting digital nothingburgers, marketing them as groundbreaking “AI,” gaining endorsements from influential investors and breaking resale records. Soon, the market will be flooded with “AI NFTs,” the vast majority of which are absolutely, positively, NGMI (not going to make it).

Lauren Kunze is the CEO of Pandorabots, a leading chatbot platform that powers conversational AI applications for developers and global brands, and its subsidiary ICONIQ, which creates embodied AI like Kuki, a socialbot that has exchanged over a billion messages with 25 million human fans.

Here are five red flags to watch for:

1) It’s a cheap deepfake.

If you’ve yet to see a deepfake-powered moving portrait, the first time feels like some serious Harry Potter magic come to life. But this seeming wizardry is actually quite accessible to mere muggles through numerous free apps (e.g., TokkingHeads, MyHeritage, Wombo) that let you turn any image into deepfake content based on a source video. So if you are creating or already own NFTs and you want to make them move, you can do it yourself on the cheap. Anything beyond minimal movement like blinking or breathing may break the illusion due to inevitable incongruities between the image and its underlying source video. Thus, if you see a moving NFT that isn’t moving much, it’s almost certainly using off-the-shelf deepfake software, and not some special secret AI sauce.

These cheap deepfakes are being used to advertise things like portraits of historical figures which, when plugged into GPT-3, could supposedly converse with you in real time. The thing is, running a real-time conversation with an AI requires a 3D CG model. So if you stumble upon a deepfake advertised as capable of conversing in real time, alarm bells should sound, because deepfakes don’t really run autonomously. Rather, they rely on superimposing someone’s face on an actor in a video during post-production (for example, Zuck on Trey Parker, or Tom Cruise on a talented TikTok impersonator), or during a video call (say, Deep Elon Zoom-bombing using free software like Avatarify).

2) It uses GPT-3 or has other third-party dependencies.

The Generative Pre-trained Transformer 3 (GPT-3) language model is amazing at generating text in the style of a specific person or genre given enough examples. Giant, generative language models – apart from being slow to reply and so expensive to train that GPT-3, trained pre-COVID, remains unaware the virus exists – are trained on the public internet, which is a bit akin to drinking from a public toilet. Human beings are also typically toilet-mouthed monsters when talking to software (and each other) on the internet, but even the human players of a GPT-3-powered role-playing game were disturbed when the algorithm inserted children into sexual scenarios. That is why OpenAI takes great care to tightly restrict chatbot use cases – and why any NFT project purporting to use GPT-3, especially to power add-on “personalities,” should raise eyebrows.

After years in limited beta with a long waitlist, OpenAI just announced that its GPT-3 API is now generally available to developers. (Until and unless Microsoft – which funded and apparently exclusively licensed GPT-3 – closes “Open” AI.) So far, some upstarts with early access have been using GPT-3 to create and sell unique “personalities” for your NFT. But now that the API is truly open to all, the reality is that anyone can generate a GPT-3-powered “personality” by feeding it a few example texts. Whether OpenAI allows your particular use case to remain live is another story. Last summer, the San Francisco Chronicle published a heartbreaking tale about a man, Joshua, who used a GPT-3-powered service to bring his dead fiancée back to life based on their texts. Months later, in a heart-rending twist, OpenAI shuttered the service without warning for violating its stringent terms; Joshua’s fiancée Jessica essentially died twice. GPT-3 NFTs are liable to meet the same fate.

NFTs are ultimately about verified ownership. If your chatty “AI” NFT depends on third-party services that the seller neither owns, develops, nor controls, what exactly do you own?
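Those few example texts amount to simple prompt engineering. Here is a minimal sketch of how such a “personality” is typically assembled as a few-shot prompt; the persona, example lines, and function name are invented for illustration, and the actual OpenAI API call (shown commented out, in the form the early SDK used) requires an API key and an approved use case:

```python
# Sketch: a few-shot "personality" prompt for a large language model.
# The persona and example lines below are hypothetical, invented for
# illustration; they are not from any real NFT project.

EXAMPLE_LINES = [
    ("What do you collect?", "Only the rarest on-chain artifacts, darling."),
    ("How are you today?", "Gloriously non-fungible, as always."),
]

def build_personality_prompt(user_message: str) -> str:
    """Assemble a few-shot prompt that nudges the model into character."""
    header = "The following is a chat with Ape #1234, a vain NFT avatar.\n\n"
    shots = "".join(f"Human: {q}\nApe #1234: {a}\n" for q, a in EXAMPLE_LINES)
    return header + shots + f"Human: {user_message}\nApe #1234:"

prompt = build_personality_prompt("Are you really intelligent?")
print(prompt)

# With the early-2020s OpenAI SDK, this prompt would then be sent
# roughly like so (requires an API key and OpenAI's approval):
# import openai
# openai.Completion.create(engine="davinci", prompt=prompt,
#                          max_tokens=60, stop=["Human:"])
```

Note what the sketch makes plain: the “AI” lives entirely on OpenAI’s servers. The NFT holder owns, at most, a prompt.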

3) AI is “on the roadmap” (and all over the marketing materials).

While GPT-3 usage or any third-party dependency is reason for serious pause, “AI” in a project’s marketing materials or roadmap without any evidence of its existence in the actual product is a red flag. AI is not an ingredient anyone can simply add into the mix provided more resources and more time. Still, given the recent buzz surrounding some AI NFT projects, people are now prefacing projects with “AI” almost as fast as they are appending “metaverse” to everything.

4) The team has no AI background.

When in doubt, a project’s team page is typically illuminating as to whether it has a prayer of accomplishing an ambitious AI roadmap. Zero AI expertise = huge red flag.

Even teams with AI know-how often struggle to deliver. Take Little Sophia, a miniature of the uncanny humanoid Sophia robot, which has raised multiple rounds of crowdfunding since 2019 with a perpetually retreating ship date (December 2022 as of this writing) and increasingly irate, disillusioned backers. Sophia – who made headlines in May for selling an NFT erroneously billed as the first digital art by a non-human, and whom her creators have called “basically alive” – is a big part of the AI hype problem. (The NFT, which auctioned for $688,888, was in fact made in collaboration with a real human artist, and was predated by decades by AI digital artists like Harold Cohen’s Aaron.)

Sadly, when it comes to the general public, these marketing gimmicks, including granting Sophia Saudi Arabian citizenship, actually seem to work. In 2017, an offshoot of the team behind Sophia raised $36 million in less than a minute in an Initial Coin Offering to build “Artificial General Intelligence” or decentralized “AGI” on the blockchain. Last I checked, they – along with the tech giants and top AI researchers – have yet to solve AGI. The moral of the story: even if the team has AI experience, they may still fall short, and if their primary background is in initial coin offerings, marketing, or other get-rich-quick schemes, it may be time to run for the hills.

5) A rose by another name smells … like a tulip.

Speaking of marketing, that guy who invented the analog pet rock was a genius. (Also a millionaire, after selling over a million pet rocks at four bucks a pop.) The rocks were just rocks, with a cute pun-filled care and feeding manual (spoiler alert: they required none) packaged in a cardboard box. But calling them a “pet” sparked a fad on par with Beanie Babies or, dare I say, Bored Apes. To wit, the “first iNFT,” which sold at auction at Sotheby’s for nearly half a million dollars, is in fact just a chatbot attached to an avatar: technology that, while less ancient than rocks, has been around since the 1960s. Which is perfectly fine, so long as buyers know that what they are buying is, in fact, just an embodied chatbot by another name.
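That 1960s lineage is literal: Joseph Weizenbaum’s 1966 ELIZA held “conversations” through nothing more than keyword pattern matching and canned replies. A minimal sketch of the technique (the rules below are illustrative, not Weizenbaum’s actual script):

```python
# Sketch of 1960s-style ELIZA pattern matching: keyword rules plus a
# canned fallback. The rules here are invented for illustration.
import re

RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bbecause\b", re.I), "Is that the real reason?"),
]
FALLBACK = "Tell me more."

def respond(message: str) -> str:
    # Fire the first rule whose pattern appears in the message;
    # otherwise fall back to a stock reply.
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return FALLBACK

print(respond("I feel undervalued"))  # Why do you feel undervalued?
print(respond("gm"))                  # Tell me more.
```

A dozen lines, no neural networks, no blockchain – and yet reflecting a user’s words back at them is still the conversational core of many an avatar being minted today.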

Regardless of the packaging, chatbot-powered avatars – ahem – AI NFTs are a promising “new” category (much like the metaverse is having a “second life”). NFTs stand to revolutionize digital ownership and put money where it rightfully belongs: back in the hands of creators. But even if the industry overcomes other hurdles like its high carbon footprint, we are only going to make it to mainstream adoption if we collectively call out scams and noise so that quality work can shine.



