The ChatGPT Bubble Is Here: $300K For ‘AI Whisperers’
Earlier this week, I lamented the somewhat terrifying prospect of 300 million full-time jobs lost globally to the miracle of generative AI.
That rough estimate came from Goldman, whose Joseph Briggs said "one-fourth of current work tasks could be automated by AI in the US."
For context, global layoffs since October sum to around half a million. That counts as the most acute bout of job cuts since 2009. The "full-potential" AI replacement figure floated by Goldman would thus constitute a veritable upheaval by comparison, roughly 600 times the scale of those cuts.
As a side note to the above, I recently used ChatGPT to get information on a software development problem. The particular problem was "deep in the weeds" of an area for which there is little information available via the usual search sites. ChatGPT responded with a very detailed description (and code). On a similar problem previously, I had literally spent hours scouring the internet for comparable information.
To put it simply, I was amazed. And I literally build AI programs — but not in the neural network area of AI like ChatGPT.
I’ve heard from an Oracle employee that Oracle management has forbidden the use of ChatGPT for software – for now – due to lack of clarification on how copyright law will work. (Yes, software can be copyrighted, although in this particular case the employee works on software products intended only for other Oracle employees.)
I had a similar experience with YouChat. I prompted it for a pretty strange function that was not likely to appear in existing examples on the web: a recursive function to replace every occurrence of the letter "a" in any string with its occurrence number (for example, the string "aaabbbaca" would be rendered as "123bbb4c5"). I specified that it had to be written in the language of a fairly obscure platform.
It did take several iterations of the prompt to get it to understand; you have to be pretty precise. On the fifth try, though, it not only generated a working function, formatted as I requested, but decided on its own to simplify the code by passing the iterator as a second parameter of the function instead of trying to store it in a global scope. That's a pretty common technique on this platform: not elegant or strictly correct, but definitely simpler to write and understand, and the approach most human programmers would have gone with. It then specifically called out that it had done this, and reminded me to pass an extra parameter '1' as the initial iterator value when calling the function. The icing on the cake: it cooked up its own example, concluding that functionName("banana", 1) would return "b1n2n3", which is exactly what I asked for. When I tried the code myself on the real platform I had requested it be written for, it was bug-free and worked as asked.
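Since the commenter's platform isn't named, here is a minimal Python sketch of the same idea: the occurrence counter travels as an explicit second parameter rather than living in global scope, and the caller passes 1 as its initial value. The function name and details are hypothetical, not the code YouChat actually produced.

```python
def replace_a_with_count(s, counter):
    # Base case: nothing left to process.
    if s == "":
        return ""
    if s[0] == "a":
        # Replace this "a" with its occurrence number and bump the counter,
        # which is carried in the second parameter instead of a global.
        return str(counter) + replace_a_with_count(s[1:], counter + 1)
    return s[0] + replace_a_with_count(s[1:], counter)

print(replace_a_with_count("banana", 1))     # b1n2n3
print(replace_a_with_count("aaabbbaca", 1))  # 123bbb4c5
```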
I was pretty flabbergasted. It was very much like asking something of a human programmer. Only, much faster.
The possibilities are exciting.
Kudos on the clever punchline. It made me chuckle.
I have both engineering and history degrees. The wordsmiths may inherit the earth.
While there are some very relevant applications, I’m in the hype camp.
At the very minimum, this will be as big as what AWS did for cloud computing, but on the optimistic end, this could very well be the biggest thing since the internet itself launched. I wouldn’t dismiss this as a fad. Not making a judgement on whether it’ll be good or bad for humanity, but this is much more substance than hype.
Now I’m curious: can you give us some examples of the prompts you use to generate the images at the top of some of your articles?
I’m sorry, I’ve already licensed that information to a prompt engineering firm for $20 million.
(I’m just joking.)
Please don’t say that kind of stuff until you implement little Facebook-style “laugh” (etc.) response emoticons.
It’s all about how well the companies can fine-tune the models for their specific use cases. Once you start figuring out the prompt structure, you can start to scale the inputs and the world becomes your oyster. You have an immediate business case with the efficiency gains that you can sell to your customers eager to cut costs. All that vendor spend that was being cut will be reinvested into generative AI capabilities at the expense of people.
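As a rough illustration of what "figuring out the prompt structure" and "scaling the inputs" can mean in practice, here is a minimal Python sketch; the template wording, the ticket data, and the send_to_model stub are all hypothetical, standing in for whatever fine-tuned model a company actually uses.

```python
# Hypothetical prompt template reused across a whole batch of inputs.
PROMPT_TEMPLATE = (
    "Summarize the following support ticket in two sentences, "
    "then suggest one cost-saving follow-up action:\n\n{ticket_text}"
)

def build_prompts(tickets):
    """Expand one vetted prompt structure over many inputs."""
    return [PROMPT_TEMPLATE.format(ticket_text=t) for t in tickets]

def send_to_model(prompt):
    """Placeholder for whatever fine-tuned model or API is actually used."""
    raise NotImplementedError

if __name__ == "__main__":
    tickets = [
        "Customer reports slow exports since the last release.",
        "Invoice totals off by one cent after currency conversion.",
    ]
    for prompt in build_prompts(tickets):
        print(prompt)  # in practice: send_to_model(prompt)
```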
I had a somewhat similar experience to brc’s: I had a call yesterday with a vendor who showed me how they’ve integrated ChatGPT into their product in a way that created descriptions of the metadata within our system. That kind of documentation is another seemingly simple but hugely valuable use case for many companies that are sitting on years of legacy tech debt that can be very difficult to unpack. Suddenly, it becomes a lot easier to replace an engineer who leaves the company if you’ve got auto-generated documentation. It was far from perfect, but the potential is massive.
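For the metadata-documentation use case, here is a minimal sketch of how such an integration might be wired up, assuming the openai Python package's chat-completion interface as it existed at the time of writing; the table and column names are invented, and a real integration would pull them from the system catalog rather than hard-coding them.

```python
import openai  # pip install openai; assumes OPENAI_API_KEY is set in the environment

# Hypothetical legacy-schema metadata with cryptic names.
columns = [
    {"table": "cust_mstr", "column": "cst_stat_cd", "type": "CHAR(2)"},
    {"table": "ord_hdr",   "column": "ord_rsn_cd",  "type": "CHAR(4)"},
]

def describe_column(col):
    """Ask the model for a plain-English data-dictionary entry for one column."""
    prompt = (
        f"Table {col['table']}, column {col['column']} ({col['type']}). "
        "Write a one-sentence description of what this column likely stores, "
        "suitable for a data dictionary."
    )
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip()

if __name__ == "__main__":
    for col in columns:
        print(col["column"], "->", describe_column(col))
```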
Blockchain and NFTs were a joke from the start, but ChatGPT cannot keep up with the demand from companies wanting to integrate it into their products. The bubble is merely in its infancy right now.
I’ve been using ChatGPT to formulate better prompts for another AI that runs Stable Diffusion.
I’ve been using it to formulate witty replies to blog post comments.
You can find a fun example by searching YT for “GPT-4 Challenges Tesla FSD”. It’s not a weird bake-off; rather, one of the Tesla FSD beta drivers tested whether GPT-4 could help him come up with a list of waypoints for the next tricky test drive he wanted to push FSD through. It took a few prompt iterations, but he came away pretty impressed with GPT-4’s utility as a handy assistant.
By the way, forgetting GPT, the latest FSD (11.3.3) is pretty spectacularly successful, based on tester consensus. If you’ve not checked out the YT set of FSD Beta 11.3.3 vids, give one or two a look. Engineers (like me) will likely tell you that no one else is even attempting what FSD has already accomplished, all “Beta” snarkiness notwithstanding – but watch and form your own opinions.
The ML stuff has made deep improvements (AlphaFold!) but was less accessible.
GPT is a game changer because it’s applicable now (though it did take at least an invisible decade to get “here”).
Like the mobile boom, where a ton of things we already had got easier and more available: email, taking and sharing digital photos, video calls, online shopping, etc.
And new things made life easier: everyone has an updated map with real-time tracking (which then begat Uber/Lyft).
Considering that reach made trillions of dollars (which is easier to calculate than “hours saved from a paper map”), I expect this wave will also make some people very wealthy.
(So paying for a talented prompt engineer might be smart, like trying to rope in an early “Mr Beast”.)