OpenAI, Humane and Apple's Shadow
The future of consumer AI hardware is coming into focus. Who will win?
Thank you for subscribing to SatPost.
Today, we will talk about OpenAI (developer conference), Humane (AI hardware) and Apple’s AI efforts.
Also this week:
Hardest language to learn
World’s biggest hedge fund
…and them fire memes (including fast food zen)
OpenAI hosted its first developer day last week.
The leading AI startup announced its new GPT-4 Turbo model, which has a 128k context window (in layman’s terms: users can feed it roughly a 300-page book and then prompt against that massive text to, say, write their college essay).
GPT-4 Turbo is also faster and cheaper than the previous model that powered ChatGPT.
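To make the upgrade concrete: a whole book can now go into a single API call. Here is a minimal sketch using the official openai Python client, assuming an OPENAI_API_KEY is set in the environment; the file path is a placeholder and the model identifier may have changed since DevDay.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Load a long document; the 128k context window fits roughly a 300-page book.
with open("book.txt", "r", encoding="utf-8") as f:
    book_text = f.read()

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # GPT-4 Turbo preview announced at DevDay; check the docs for the current name
    messages=[
        {"role": "system", "content": "You are a careful summarizer."},
        {"role": "user", "content": f"Summarize the key arguments of this book:\n\n{book_text}"},
    ],
)

print(response.choices[0].message.content)
```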
Another significant launch was the GPT Builder, allowing users to create personalized AI agents (e.g., a paleo Vietnamese food recipe tool) using natural language. There will be a marketplace where users can list their agents, called GPTs, for sale (yes, SatPost readers will get a discount on my paleo Vietnamese recipe tool upon its release).
A viral clip from the presentation featured OpenAI CEO Sam Altman creating a GPT that offers startup advice based on his writings. The demo only took 4 minutes and required no coding.
Combined with audio and visual AI tools — I saw one demo where someone snapped a photo of a landing page and had code spit out for how to recreate it — the possibilities for a GPT Builder feel endless.
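The landing-page demo maps onto the vision-capable model that OpenAI exposed through the same chat API. A rough sketch under my own assumptions (the image URL is a placeholder and the model identifier may have changed):

```python
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4-vision-preview",  # vision-capable preview model from DevDay; check the docs for the current name
    max_tokens=1000,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Write the HTML and CSS to recreate this landing page."},
                {"type": "image_url", "image_url": {"url": "https://example.com/landing-page.png"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```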
Some heralded the keynote as Altman's “Steve Jobs moment”.
The most salient example of this comparison came when Altman received a calendar notification during another GPT agent demo. He then held up his phone and even nailed the 83° arm bend that Jobs did when he unveiled the iPhone in January 2007.
In a thorough review of the OpenAI keynote, Ben Thompson was impressed by the GPT agents but believes that ChatGPT — which is still just an app (albeit one that is swallowing up other traditional apps) — is the wrong AI interface.
“AI is truly something new and revolutionary and capable of being something more than just a homework aid, but I don’t think the existing interfaces are the right ones,” Thompson highlighted from a previous article he wrote on the topic. “Talking to ChatGPT is better than typing, but I still have to launch the app and set the mode; vision is an amazing capability, but it requires even more intent and friction to invoke.”
When it comes to technology interfaces, no one has been better than Apple. Although its generative AI offerings are currently slim and Siri somehow keeps messing up my alarm clock times, the iPhone maker has all the pieces to create the best AI interface.
To understand why, let’s look at:
Lessons from Steve Jobs keynotes
Humane’s AI Pin
The state of Apple AI
Lessons from Steve Jobs keynotes
Let’s rewind to 2008: back then, Altman presented his social app Loopt in front of Steve Jobs at Apple’s Worldwide Developer Conference (WWDC). The Apple founder said the product was “cool”.
Reflecting on the event years later, Altman — who idolized Jobs — told CNN that it was the only time he had been “frozen out of nervousness in any business context.”
One lesson he drew from the Apple co-founder was the psychology of how people want to interact with technology.
During his appearance on the Lex Fridman podcast, Altman shared an anecdote about Apple’s colorful and translucent iMac desktop:
“A story that has always stuck with me [and] I don't know if it's true. I hope it is true.
[The story is] that the reason Steve Jobs put that handle on the back of the first iMac was [because] you should never trust a computer you couldn't throw out a window.
Of course, not that many people actually throw their computers out a window, but it's sort of nice to know that you can. And it's nice to know that this is a tool very much in my control. And this is a tool that does things to help me."
I did a bit of Googling and found that the “throw out of a window” quote is actually attributed to Apple’s other co-founder, Steve Wozniak. However, there doesn’t seem to be any evidence that Wozniak even said that.
While Altman's referenced story may be apocryphal, the main lesson is still valid: the user’s comfort level with a computing interface matters a lot.
Apple’s former design chief, Jony Ive, has his own version of why there was a handle on the first iMac. It doesn’t mention throwing hardware out of windows, but instead, it is about personal comfort levels.
“Back [when we launched iMac in 1998], people weren’t comfortable with technology. If you’re scared of something, then you won’t touch it. I could see my mum being scared to touch it. So I thought, if there’s this handle on it, it makes a relationship possible. It’s approachable. It’s intuitive. It gives you permission to touch. It gives a sense of its deference to you.”
Remember, the iMac is a desktop computer.
Why do you need a handle? To move it 3 feet from one end of your desk to the other? No, the handle is about the comfort factor and forming a connection with technology.
“Deference to you” sounds like another way of saying “I can throw this f**king thing out the window anytime I want”.
As many of you know, Jobs returned to Apple in 1997 and — with the iMac’s success — kicked off the greatest run in corporate history. The Jobs-Ive combo created a series of approachable, intuitive and deference-giving blockbuster consumer products: iPod (2001), iPhone (2007), MacBook Air (2008) and iPad (2010).
I re-watched the keynote unveilings for these products. Jobs and Ive loved tactile designs and emphasized the "fiddle factor", in which users can touch and feel the technology.
Here is a play-by-play of some classic Jobs announcements:
iPod Nano: “Where is this thing? There’s no way Steve is hiding it in his 5th jean pocket. You can’t fit anything in there. Maybe a twenty-dollar bill to grease the bouncer at the nightclub, but nothing else. There’s no way he’s sliding something out of that pocket. There’s no way he’s…HOLY CRAP, IT’S IN HIS 5TH JEAN POCKET! THERE’S A F**KING IPOD NANO IN HIS JEAN POCKET!! AND HE’S HOLDING IT WITH TWO FINGERS!! LOOKS LIKE HE COULD TOSS IT OUT THE WINDOW ON THE OTHER SIDE OF THIS ROOM IF HE HAD TO!”
iPad: “What’s that leather lounge chair doing on the stage? Kinda looks like a mini-casting couch. That’s gotta be an uncomfortable place to use a laptop. Looks great for a Whiskey and Diet Coke. Not for a computing device, though. No way Steve sits there. Wait. WHAT?!? STEVE IS USING THE IPAD ON THE LEATHER LOUNGE CHAIR!! THAT IS INCREDIBLY INTUITIVE!!!”
MacBook Air: “Wonder why Steve has a yellow envelope on stage. Probably his taxes. Weird to have during a keynote but, yeah, whatever. Wait, he’s twirling that little red string that keeps envelopes closed. Paper isn’t metallic. PAPER ISN’T METAL GREY! HE’S HOLDING A F**KING LAPTOP!!! SO APPROACHABLE!”
Since Apple’s founding in 1976, its products have aimed to reduce computing complexity while getting more and more personal: Mac (graphical user interface and mouse), iMac (handle, all-in-one design), iPod (click wheel), iPhone / iPad (multi-touch) and Wearables (Watches and AirPods that sync with everything else and — mostly — “just work”).
With that background, it is unsurprising that Altman is reportedly in talks with Ive — who left Apple in 2019 to form his own design firm LoveFrom — to raise $1B+ from SoftBank’s Masayoshi Son and build an AI hardware product.
What will they make? What is the correct generative AI interface?
Existing smartphones? Next-gen AR glasses? Brain implants? AI-assistant earpiece like in the film Her?
Humane’s AI Pin
One new AI hardware example dropped last week.
It is called the AI Pin and was created by Humane. The startup was founded by Imran Chaudhri and Bethany Bongiorno, both longtime Apple veterans.
Chaudhri worked with Steve Jobs on all of his post-comeback hits and was part of an Apple duo known as the “Lennon and McCartney of design”. Bongiorno was instrumental in the development of the iPad.
So, what is AI Pin?
It’s a square computing device — packed with a camera and sensors — that you can pin to your shirt. The main computing input is voice and the device can project digital ink onto your palm.
The AI Pin shows how generative AI enables post-smartphone and post-app interfaces, as described by Om Malik:
So far, we have used mobile apps to get what we want, but the next step is to just talk to the machine. Apps, at least for me, are workflows set to do specific tasks. Tidal is a “workflow” to get us music. Calm or Headspace are workflows for getting “meditation content.” In the not-too-distant future, these workflows leave the confines of an app wrapper and become executables where our natural language will act as a scripting language for the machines to create highly personalized services (or apps) and is offered to us as an experience. […]
The way I see it, the evolution of apps to “experiences” means that we are seeing the end of the line for the App Store as we know it.
“It’s not about declaring app stores obsolete; it’s about moving forward because we have the capability for new ways,” Chaudhri argued.
Humane’s idea is to make these workflows (aka apps in smartphone terms) available to us through its myriad interfaces — primarily voice.
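One concrete way to picture “natural language as a scripting language” is the function-calling pattern that today’s LLM APIs already support: the user says what they want, the model turns it into a structured call, and the device executes the workflow. A rough sketch, where the play_music tool and its schema are hypothetical stand-ins I made up for illustration:

```python
from openai import OpenAI

client = OpenAI()

# A hypothetical "workflow" the assistant can invoke instead of the user opening a music app.
tools = [{
    "type": "function",
    "function": {
        "name": "play_music",
        "description": "Play a track or playlist for the user.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Artist, song or mood to play"},
            },
            "required": ["query"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "Put on something calm for meditating."}],
    tools=tools,
    tool_choice="auto",
)

# The model returns a structured call; the device code actually runs the workflow.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```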
Another term tossed around during Humane’s launch was “ambient intelligence”, as in the AI is just kind of there when you need it.
The AI Pin is an interesting idea but the 10-minute demo video did not leave me thinking “I need this”, especially at the current price of $699 (+ $24 a month for text/talk/data).
Avi Schiffmann — who works on AI wearables — noted design problems with Humane in relation to human psychology:
New form factor: The iPhone, AirPods and Watch all fell into established categories where people immediately knew how to use them. The AI Pin requires completely new behaviour.
Socially acceptable? People don't wear pins in everyday life (I think). Also, the Pin has privacy concerns with its audio and visual inputs (the camera has a "trust" feature — a green light indicates it's on and a red light indicates it's off — which looks a bit creepy). Speaking out loud to an inanimate object in public also seems like a mental leap.
Decision fatigue: The daily exercise of "where do I stick this thing on my shirt" is another thing to worry about. (I can barely figure out which socks to wear each morning. This is why Jobs only wore black turtlenecks and Levis jeans with a fifth pocket that could fit iPod Nanos.)
These issues do not make the AI Pin sound very “approachable”, “intuitive” or “deference-giving”. One alternative that Schiffmann offers is a necklace form factor. I do not have strong feelings about necklaces, but at least people do wear them and know how to put them on the same way every time.
Another hiccup with the Humane launch is its competitive positioning.
I like the idea of not carrying dopamine-dripping smartphones everywhere and my solution is much cheaper than $699: I started using a Kale Phone and a Cocaine Phone.
Who is Humane actually for?
It is obviously unfair to compare most things to the iPhone as the device is literally the greatest product in the history of capitalism with 2B+ units sold and $1T+ in lifetime sales.
But it is worth noting the crystal-clear 2x2 positioning matrix that Steve Jobs used in the iPhone launch keynote.
The axes on the matrix were “smart / not so smart” and “hard to use / easy to use”. In the least shocking corporate decision ever, Jobs slotted the iPhone in the upper right corner.
The iPad also had a great positioning slide, which highlighted what Jobs thought the tablet form factor did better than a smartphone or laptop. Even though the iPad didn't fulfill the vision, it was at least clear how the tablet could substitute or augment existing devices.
The state of Apple AI
Unlike the iPhone and iPad positioning slides, the AI Pin demo raised more questions than it answered.
After the Humane launch, any future AI hardware offering has to answer this question: If someone is spending up to $699 on an “ambient intelligent” device in addition to a smartphone — because it is not enough to replace a smartphone — then why wouldn’t they just wait for Apple’s version?
I imagine that is the calculation that the 1B+ people who already own an iPhone will make. Out of those, 200m+ have bought a Watch and 400m+ have bought AirPods (I am responsible for 10% of these purchases because I keep losing the left bud).
There are valid questions as to whether Apple actually wants to live in a post-App world. Multiple sources familiar with the matter tell me that the iPhone and App Store combo makes quite a bit of money.
Tim Cook & Co. have been slow to respond to generative AI since the launch of ChatGPT. According to Bloomberg's Mark Gurman, the company was caught "flatfooted" and its “only significant AI release” in the past year was an improved auto-correct system.
However, the market is forcing Apple into action and it is committing at least $1B a year — which is a drop in the bucket — to spread generative AI across its products:
Apple GPT: Apple has its own large language model (LLM) called Ajax and could release it soon.
iOS: Ajax will improve auto-complete and suggestions within Siri and the Messages app.
Xcode: Integrating AI into the platform for Apple-product app developers (similar to Microsoft with GitHub Copilot).
Consumer apps: AI across Apple Music (auto-generated playlists like Spotify) and the productivity suite (writing tools for Pages, auto-generated slides for Keynote)
For those who are wondering “Trung, what about Google, Android and its various AI projects?”
Yes, they are in the mix (on a side note: my 2010 Samsung Galaxy Note with the stylus was so baller).
Meta has an open-source LLM (Llama) and wearables hardware (Quest, Meta Ray-Bans). Another under-the-radar contender could be a cross-pollination of Elon's companies, including Teslabot/xAI/X/Neuralink.
However, Apple has always been the best at integrating hardware and software.
Last month, Microsoft CEO Satya Nadella — whose company has invested $10B+ into OpenAI and is the startup’s major strategic partner — said one of his biggest regrets was shutting down Windows Phone and leaving the mobile market. He called it a missed opportunity to define the next generation of computing, which almost certainly includes personalized AI-first devices.
And do you know what personalized AI-first devices need? Powerful chips that are energy efficient and can optimize compute, power and memory trade-offs.
Kind of like Apple’s line of custom chips.
Case in point: the new M3 chip that powers the Mac and MacBook lines. While the M3 is designed for desktops and laptops, its performance — which Apple claims is 3-4 years ahead of competitors on key metrics — will find its way into the A-series (iPhone, iPad), H-series (headphones) and S-series (Watch) chips.
Here is another article from Om Malik on the M3 chip and Apple’s silicon efforts:
AI algorithms function with extreme parallelism. While adding more compute power (and GPUs) can address this, the real challenge lies in how quickly data can be moved, how promptly and extensively memory can be accessed, and the amount of energy required to operate these algorithms efficiently. Apple’s strategy with its Silicon has been remarkably prescient, taking into account these realities even in their latest GPU updates. […]
Apple has a substantial opportunity to integrate generative AI into its core platform, mainly because of its chip and hardware-level integration. For example, by actively incorporating open-source generative AI models into their SDK and developer tools, Apple can leverage the evolving nature of the interaction between humans and machines in the digital world.
Apple’s silicon gives the company options for powerful on-device processing, which offers more privacy than running everything through the cloud. Privacy will be important for any “ambient intelligent” device and Apple seems to have the edge on this front.
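To make the on-device angle concrete, here is a rough sketch of running a small open-source model locally on Apple silicon using the llama-cpp-python bindings (the GGUF model file is a placeholder you would download yourself). Nothing in this loop touches the cloud, which is the privacy point.

```python
# pip install llama-cpp-python  (builds with Metal acceleration on Apple silicon)
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder: any local GGUF model file
    n_gpu_layers=-1,  # offload all layers to the GPU via Metal
    n_ctx=4096,
)

# The prompt and the completion never leave the device.
output = llm(
    "Q: Draft a two-sentence reply politely declining a meeting.\nA:",
    max_tokens=128,
    stop=["Q:"],
)

print(output["choices"][0]["text"])
```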
Final Thoughts
I’ll wrap with one final Altman and Jobs story. In 2009, Paul Graham — the co-founder of Y Combinator — wrote a blog post about the “5 most interesting startup founders of the past 30 years”.
Steve Jobs was #1.
TJ Rodgers was #2.
Larry Page and Sergey Brin were a combo at #3.
Paul Buchheit was #4.
Sam Altman was #5.
At the time, Altman was only 24 years old. He wasn’t yet the President of Y Combinator and was years away from co-founding OpenAI with Elon and other tech hitters.
“Honestly, Sam is, along with Steve Jobs, the founder I refer to most when I'm advising startups,” Graham writes of the must-have-been-shocking-at-the-time Altman choice. “On questions of design, I ask ‘What would Steve do?’ but on questions of strategy or ambition I ask ‘What would Sama do?’”
The framing kind of reminds me of when Netflix started making its own original content and now-co-CEO Ted Sarandos stated that “the goal is to become HBO faster than HBO can become us.”
OpenAI has the current lead. But it has to figure out design and hardware — manufacturing and shipping to 1B+ users — before Apple figures out its AI strategy and ambition.
Even post-Jobs and Ive, Apple has the easier of the two paths.
Caveat: everything I’ve written is moot if/when someone cracks artificial general intelligence (AGI). In the meantime, Apple needs to make some AI for its AirPods to make sure this never happens to me again.
Links and Memes
What’s the hardest language to learn? The US State Department places languages into four tiers based on how long they take to learn: from 24 to 88 weeks. Per The Economist, difficulty for native English speakers is determined by:
Writing system: the hardest languages don’t use the Latin alphabet (e.g., Mandarin, Japanese)
Sounds: sounds that don’t exist in English (“clicks” in some African languages; rising and falling tones in many Asian languages)
Grammar: words that change endings based on their role in a sentence (many Arabic words take different prefixes, suffixes or vowels depending on placement)
Shared etymology: languages that share roots with English (e.g., Latin-derived vocabulary) are easier to pick up
Author Mark Manson shared an important addendum to my X post on this topic:
These numbers are based on the assumption of full immersion and 40+ hrs/week of tutoring/classes. State Dept language training is actually incredibly rigorous. Picking up that language becomes your full-time job. For casual learners, I'd multiply these numbers by 3-5x.
Based on the fact that I’ve stopped and started Vietnamese on Duolingo over 1000x over the past 7 years, that’s the hardest language for me (State Dept. says it should only be 44 weeks).
Once Apple's "ambient intelligence" product drops, I'll just use the real-time translation feature.
***
Bridgewater is the world's biggest hedge fund with $150B+ in assets under management. It's also likely the most successful hedge fund of all time in terms of absolute dollars returned (~$50B).
The fund's founder Ray Dalio launched the firm in 1975 from his apartment. One of his early wins was figuring out how to help McDonald's hedge the price of chicken so the fast food chain could launch Chicken McNuggets. As the fund grew (and grew), Dalio started instituting internal company rules called Principles (which he later published as a best-selling book).
Anyways, a book (The Fund) on Bridgewater just came out and the stories are wild as told in this NY Mag excerpt. One long-time employee told Dalio of his Principles: "You’ve got 375 Principles. Those aren’t principles. Toyota has 14 principles. Amazon has 14 principles. The Bible has ten. Three hundred and seventy-five can’t possibly be principles. They are an instruction manual.”
Someone also once pissed on the floor next to a urinal and Dalio apparently had employees monitored for weeks to find out who did it.
Dalio wrote a short post on LinkedIn saying the book was written with input from disgruntled former employees and that the only proof anyone needs is that clients have stuck with the fund for years (even during periods of underperformance).
***
And them other baller links:
**Podcast Alert**: I joined Ryan Whitney, Paul Bissonnette and Rear Admiral on Spittin' Chiclets, the world's #1 hockey podcast. It was an absolute blast talking about SBF, WeWork, AI, Costco, NHL vs. NBA and so so so many jokes.
Prenuvo is an MRI-scan startup. I know a few people who have tried it, but the controversy seems to be that some scans flag potential ailments that turn out to be nothing and just add extra stress for customers.
Bearly AI Update: My AI-powered research app is *not* releasing AI hardware (yet). But you can try leading text and image models (including OpenAI updates) to save hours on work. Check it out here and use code SATPOST1 for one month free of the Pro tier.
How I read: Rob Henderson is one of my favourite writers, covering a lot of topics on human psychology. He reads 40-50 books a year. He breaks down his process and says "There’s no secret. I read pretty slowly. I take notes, I underline, I highlight, I jot my thoughts in the margins, I pause if I encounter an especially interesting passage or idea."
…and here are them fire tweets / X posts:
Finally, Amazon Web Services in Netherlands converted a former prison into a work office. People are literally working in jail cells and the puns in this TikTok are gold lol: