This is "The AI Economy," a weekly LinkedIn-first newsletter about AI's influence on business, work, society and tech and written by Ken Yeung. Sign up here.
Welcome to another edition of “The AI Economy,” a newsletter curating the week’s most relevant AI news impacting business, work, society and technology.
In this issue:
1️⃣ OpenAI launches the GPT Store and a premium plan for small teams
2️⃣ AI was everywhere at CES 2024
3️⃣ An A-to-Z guide to understanding AI buzzwords
Today’s Prompt
OpenAI launched its app store on Wednesday, following its announcement last November. However, access to the marketplace, known as the GPT Store, is limited to paid subscribers: ChatGPT Plus, Team, and Enterprise users. The company says users have already created more than 3 million custom versions of ChatGPT.
Drawing parallels to Apple’s App Store, the GPT Store marks a significant development in Generative AI. OpenAI COO Brad Lightcap believes the marketplace will surpass even the App Store’s impact:
It’s a lot of work to build a single iOS app. If you look at what GPTs are, that can be spun up quickly, configured quickly. We’ll make it easier and easier for them to have accessibility into other services in other parts of your workflow. Some of them will be fun; some of them will be for work. So you get a lot of diversity there. I expect there can be many thousands of GPTs that any individual person ever comes in contact with specific to the things they want to do.
At launch, users can pick from various GPTs across a host of categories such as writing, research and lifestyle. Some of the chatbots in the marketplace include those from hiking guide AllTrails, Canva, Consensus and Khan Academy. And there’s a low barrier to building for the GPT Store: No coding skills are required.
But don’t plan on making money from your GPT, at least not right away. OpenAI says a revenue-sharing program will launch sometime in Q1 2024, with payouts for U.S. builders based on how much users engage with their GPTs.
A Subscription Plan for SMBs
To increase the visibility of GPTs, the company also introduced ChatGPT Team, a premium plan designed for teams of up to 149 people. Priced at $30 per user per month, or $25 per user per month when billed annually, the option should appeal to small and medium-sized businesses that find the cost of an Enterprise plan too high. Between its free, Plus, Team and Enterprise tiers, OpenAI now has an offering for the entire spectrum of customers.
Don’t Share Sensitive Information
Though the App Store comparison is fitting, the GPT Store also resembles the add-ons available for Google Docs or Microsoft Office. Working professionals will turn to these GPTs to automate daily tasks, expedite financial processing, handle customer support, plan campaigns and more. Even so, they must watch out for hallucinations in GPT responses and avoid sharing sensitive information; it’s critical to know where that information is going.
And speaking of Office: given Microsoft’s significant investment in OpenAI, is it possible that some of these third-party GPTs might eventually become available as extensions for Microsoft Copilot?
All Your Data Trains ChatGPT
In the grand scheme of things, interactions with GPTs will likely be used to train OpenAI’s LLMs, shaping the next version of ChatGPT. That ongoing process not only improves ChatGPT but also yields new features that third-party developers can use to enhance their GPTs, keeping the innovation wheel turning. It’s also a clever way for OpenAI to capture more market share from its competitors.
Like any platform owner, OpenAI understands it can’t cater to everyone’s needs. The GPT marketplace is its way of casting a wider net and discovering new ways these language models can be put to use. As it digs deeper into that exploration, it may find specific scenarios that deserve more dedicated attention.
Share Your Experience
What excites you the most about the GPT Store? Leave a comment or send me a message with your thoughts. Also, if you’ve built a GPT, share your experience and how you envision individuals, teams and companies using it.
A Closer Look
The 2024 Consumer Electronics Show (CES) wrapped up this week, and as predicted, AI was a major theme. It was infused into most announcements, from ChatGPT-powered integrations and new chipsets to AI PCs and a buzzy mobile device called the Rabbit R1.
If it wasn’t obvious enough, Samsung’s theme for the show was “AI for All.” Beyond the new appliances and televisions it unveiled at CES, the company touted updates to its smart home platform SmartThings and its AI assistant Bixby (yes, it still exists). As I noted in my 2017 analysis, Samsung is emphasizing how important these services are in helping consumers manage the AI ecosystem in their homes.
Expect to hear a bit more about this theme on January 17 at Samsung Unpacked when the newest Galaxy smartphones will be announced.
Non-tech brands also got in on the AI act: Mastercard debuted a tool providing personalized help with starting a small business; Walmart says it’ll launch a chatbot and use drones to enhance customers’ shopping experience; and L’Oréal demonstrated an AI-powered beauty advisor.
ChatGPT had at least one moment in the spotlight: Volkswagen says it’s bringing the Gen AI platform to its electric vehicles, allowing drivers to converse with their car or SUV. Still, even though ChatGPT wasn’t directly mentioned in many announcements, its influence on CES can’t be overstated. It, along with OpenAI, has paved the way for Gen AI’s integration into consumer electronics and the way we use technology today.
Jumping over to hardware, what would CES be without computers and laptops? Unlike in past years, foldable screens didn’t dominate the show. Instead, it was AI PCs: computers powered by chips with dedicated neural processing units (NPUs) that can handle AI workloads locally. Two chipmakers, Nvidia and Intel, revealed efforts to support these devices, and the latter also plans to bring AI PCs to the car. With AI running locally, you get faster responses, don’t need an internet connection, and can potentially keep your usage data on the device instead of sending it to the cloud.
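To make the “AI running locally” idea concrete, here’s a minimal sketch of on-device text generation using the open-source Hugging Face Transformers library and a small stand-in model (distilgpt2, chosen purely for illustration, not something any AI PC actually ships). After the one-time model download, inference runs entirely on your own hardware, with no calls to a cloud API.

```python
# Minimal sketch of local, on-device text generation.
# Assumes `pip install transformers torch`; "distilgpt2" is a small
# stand-in model used only for illustration.
from transformers import pipeline

# The model is downloaded once; after that, inference runs entirely on
# local hardware (the CPU here), with no requests to a cloud service.
generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "The biggest AI trend at CES 2024 was",
    max_new_tokens=30,
    do_sample=True,
)
print(result[0]["generated_text"])
```

The bet chipmakers are making is essentially this pattern at scale: keep the model weights and the inference loop on the device, and the network connection becomes optional.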
But perhaps the one tech that had people talking was the Rabbit R1, a palm-sized, AI-powered device functioning as your smart personal assistant. Priced at $199, it’s part of a growing trend to either replace (think Humane AI Pin) or supplement the smartphone. Rabbit generated so much buzz that it quickly sold out its pre-orders.
It doesn’t use a traditional operating system and doesn’t run any apps. However, you can connect the R1 to your existing apps and have it call an Uber, play a Spotify playlist, or handle more complex requests such as booking a vacation or ordering food. It’s powered by Rabbit’s own large action model alongside OpenAI’s GPT-4.
But beyond the hype, some wonder whether the Rabbit R1 is just vaporware. Can it live up to its promise, especially when handling tasks more nuanced than ordering an Uber or automating the lights at home?
In the end, what does all this AI news mean? What should we take away from these innovations? I like the take from CNET Editor-at-Large Connie Guglielmo:
1) The AI trend in consumer devices will result in products that can anticipate our needs and take action, saving us time and effort, and delivering better results than we might attain for ourselves.
2) It’s easy to slap AI on a product, but it’s harder to explain the problem it’s solving. Guglielmo says makers need to show the benefits of AI beyond chatbots and explain why the technology is useful and needed.
3) The current “smart” devices we have “are still pretty dumb.” Guglielmo suggests we look beyond the devices and focus on the software powering the AI experiences.
Remember This
“Right now, [AI’s] inaccuracies are providing humanity with some breathing room in the transition to coexistence with superintelligent AI entities. Because we can’t trust LLMs to be correct, we still must do the work of fact-checking them. This keeps us in touch with reality, at least until we turn the whole shebang over to GPTs.”
— WIRED’s Steven Levy on why he values AI hallucinations
Can’t Miss Event
Curious about how OpenAI works with startups? Join the Lynx Collective on January 19 to learn more. OpenAI Startups Team lead Marc Manara will sit down for a 30-minute conversation about how the company chooses which entrepreneurs to work with and which ones it admits into its Converge program.
Tickets for this virtual event are free, but you must RSVP to attend.
Neural Nuggets
🏭 Industry Insights
Your A-to-Z guide to understanding AI buzzwords (LinkedIn)
Assistive technology is AI’s next billion-person market (Axios)
How AI replaced the metaverse as Zuckerberg’s top priority (Bloomberg)
AI startups are booming in San Francisco’s ‘Cerebral Valley’ (Le Monde)
How UAE Team Emirates is going all-in on AI for its Tour de France fight (Outside Magazine)
🤖 Machine Learning
Microsoft’s Phi-2 LLM goes open source with claims it’s better than Google’s Gemini Nano (The Decoder)
Why Writer’s Palmyra LLM is the AI model for the enterprise (VentureBeat)
✏️ Generative AI
Amazon’s flawed GenAI chatbot was the result of a “rushed” launch, insiders say (Business Insider)
GenAI has a visual plagiarism problem (IEEE Spectrum)
How Snapchat+ is the first GenAI product after ChatGPT (Big Technology)
🛒 Commerce
Walmart tests AI to improve the shopping experience, expands drone delivery, launches a GenAI search tool, and adopts “scan and go” tech (Associated Press)
⚙️ Hardware
Nvidia reportedly sees pushback on downgraded chips from Chinese customers (The Wall Street Journal)
💰 Funding
AI startups raised $50 billion in 2023, but things might be different in 2024 (Crunchbase News)
Inside Anthropic’s unusual $750 million funding round (Forbes)
Former Twitter CEO Parag Agrawal raises $30 million from Khosla Ventures for an unnamed AI startup to build software for LLM developers (The Information)
⚖️ Copyright and Regulatory Issues
OpenAI responds to New York Times lawsuit, says ‘regurgitation’ of content is a ‘rare bug’ (CNBC)
💥 Disruption
AI-driven drug discovery set to boom in 2024 (VentureBeat)
Mayo Clinic partners with Cerebras Systems to develop AI for health care (Reuters)
AI threatens to push human fashion models out of the picture (Bloomberg)
Previewing what AI will produce during the next decade and beyond (ZDNet)
🔎 Opinions and Research
The on-device AI era is closer than it seems (Supervised)
Building a more human internet with AI (Sabrina Ellis, Pinterest)
🎧 Podcasts
Nancy Wang: AI can unlock opportunities for underrepresented groups in tech (Geekwire)
AI and data best practices with West Valley AI founder Karin Golde (The AI Artifacts Podcast)
OpenAI’s make or break lawsuit and the golden idol of AGI (Motherboard)
End Output
I hope you enjoyed diving into the latest articles on “The AI Economy!”
I’m eager to hear your thoughts on this edition. What struck a chord with you, and what left you scratching your head? Leave a comment or shoot me a message on LinkedIn with your feedback — it’s the secret sauce that makes this journey worthwhile.
Missed any articles this week? I know staying up-to-date on all the AI news can feel overwhelming. Fret not; I’m curating the big stories in my Flipboard Magazine, “The AI Economy.”
Connect with me on LinkedIn and check out my blog to read more insights and thoughts on business and technology.
Got a story you think would be a great fit for “The AI Economy”? Awesome! Shoot me a message – I’m all ears for your pitches. Let’s chat, share ideas, and better understand the AI landscape together!
Thanks for reading and be sure to subscribe to receive future editions.
Until next week, stay curious!
Subscribe to “The AI Economy”
New issues published on Fridays, exclusively on LinkedIn