"It's like clickbait, but for smart people."​

"It's like clickbait, but for smart people."

Issue LXXIII | Commodities-R-Us and the Stewards of Sameness

I'm not yet ready to talk about Succession's return on Sunday night (that much guilty pleasure in one place takes time to process), but the quote in the subject line is, perhaps, my life's work. 😎 If anyone wants to launch a "Substack meets Masterclass meets The Economist meets The New Yorker," I volunteer as tribute.


More to the point, I still count myself in that class of professional who believes that humans thinking better matters. While there's no shortage of that broad category of "Humans Behaving Stupidly," by and large, most people in your organization offer more in their humanity than they do in their ability to get machines to do things.

For the past decade, businesses that live and die by trust (wealth managers, educational ventures, complex service providers) have been flooded by a religion of technocratic transactionalism: as long as we can feign interest in our marketing, automate connection, and replace relationships with platforms, then all will be well.

Looking around, it's not hard to surmise that we've been deceived. We are not, in fact, better off for putting efficiency and volume over quality and meaning. Along the way, CEOs have been induced to dilute their businesses into quick fixes, content factories, and bland, indistinguishable brands. In businesses like wealth management, the marketing vendors are first in line to contribute to this madness. And now they are armed with an arsenal of AI to speed up the process (while charging you the same).

This week's Forward is a lesson in the Stewards of Sameness: those trends, places, and people who have a vested interest in you blending in (and how to recognize them coming).


Selling trust to pay for risk

Buried in the headlines around Silicon Valley Bank and the subsequent banking crisis is a tale as old as time. SVB and others in the elite banking sector had attempted to expand their revenue model by strapping investment advisors onto their branches, in part to gain access to those advisors' high-net-worth clients and bolster their cash accounts. And now the advisors and their clients are collateral damage.

As a reminder, savings and checking accounts are how banks get "loaned" money so that they can make the real dollars by investing, charging fees, and loaning out money at higher interest rates. 

So customers with large cash balances are as much lenders as they are customers of the bank. And when those customers come via the high-trust business of wealth management, we have an oil and water scenario.

The wealth advice business should produce some of the most customized and personal services in the marketplace, but it's often treated as a cash cow for other lines of business or for the founder's lifestyle.

The fallout at the mid-market banks and their financial advisory arms is just one more cautionary tale of what happens when wealth management isn't understood to be the trust business it ought to be. It's no wonder the biggest marketing players in the industry continue to ship services that make RIAs sound, look, and feel as if they are all the same.

It makes their business faster and more easily automated. I wonder what kind of magical technology might help them with that. 

This, of course, leads me to a question:


Is getting stupider really worth it? 

You can't turn on Al Gore's internet without getting blasted in the face with some techno-evangelist (Bill Gates being the latest) telling us how AI is going to change all our lives. We are supposed to be in awe that a computer fed bazillions of lines of data into circuitry and code specifically designed to process and integrate that data is good at... processing it? That a computer built by data engineers on a sophisticated model of data engineering is somehow able to pass Google's skill screen for a $180,000 data engineer?

You'll have to forgive me if I don't stand amazed that a giant web of code is somehow an expert at... code. Large Language Models (like those at the base of ChatGPT and its rivals) have great usefulness for organizing existing data, ordering information, even adjusting content to reflect new constraints. (I've used them for all of the above.)

Human intelligence is at stake. The Digital Age took the dehumanizing principles of the Industrial Revolution and turned them into ones and zeros. It wanted us to believe that we are all just cogs in a machine. And those who've bought into this grand fallacy are the exact same people who seem to believe our colleagues and our own minds are better replaced by the sparkle and shine of algorithmic intelligence.

Our ability to process reality, make decisions, build relationships, and create economies depends in no small way on our ability to parse fact from fiction, to break apart ideas and put them back together. When we outsource such fundamental human activities to machines, we all pay the price.

Are there powerful and useful applications of LLMs and other algorithmic intelligence? Definitely. We have AI clients doing incredible work to break down large bodies of data into actionable information that humans can interpret.

But the rise of ChatGPT is—as new tech always does—bringing religion with it. And in this case it's a secretive one where nearly 50% of YOUR employees are using it for work and most of them aren't telling you. I wonder why?


Traitors among us 


Since trust is on the menu, I couldn't not send you to Peacock's The Traitors for this week's Screen. I've been watching it with my family and we've been enraptured by the campy yet thrilling game show. Structured similarly to Survivor or Big Brother, the show sequesters a group of people in a Scottish castle for a high-stakes game of (functionally) Mafia, where deception, half-truths, and no shortage of fun challenges ensue.

 

We were literally screaming at the TV last night as we watched the finale. Give it a go...


Daisy Jones and The Six


I've been holding my breath for over a year for Taylor Jenkins Reid's quick, smart, and thoroughly engrossing book, Daisy Jones and the Six, to make its way to TV.

All episodes are available now via Amazon. While the consistently disappointing book-adaptation apparatus at Hello Sunshine has taken some of the edge off the tale, the heart of it survives.

The book and series tell the tale of a fictitious Fleetwood Mac-esque 1970s band. The entire cast (including Elvis' granddaughter!) created a completely original album of songs, the best of them blaring on my Sonos speakers as I type this to you. Give it a listen.


The most important 100 pages you'll read this year?


Before you immediately reject a recommendation for a book written by a philosophy professor... hear me out.

We now live in a moment where both halves of a divided country, one irate over misinformation about an election, the other irate over the same regarding a vaccine, have joyously agreed that we should outsource our thinking to Large Language Models (LLMs) like ChatGPT, which have no regard for truth (or lies). On Bullshit should have been required reading in 2016, but it's even more critical today.


Tech-forward solutions without the religion

It's never the tech that's the issue. Humans' drive to innovate is nearly insatiable. As I teach our clients: innovation is one of the Six Elements of Creativity that every business has to master.

ChatGPT's capabilities cause me no concern. In fact, like many of you, I find them fascinating. What does concern me is the convergence of this technology with our current cultural inability (in business and otherwise) to discern the difference between bullshit, lies, and truth. Take that lack of essential filters, feed it 100x or 1000x more content with no way to validate its truthfulness or usefulness, and you have a recipe for disaster. Particularly for businesses that engender trust.

To make the case, we finish with a quote from Harry Frankfurt from the above treatise, On Bullshit:

 

"What is wrong with a counterfeit is not what it is like, but how it was made. This points to a similar and fundamental aspect of the essential nature of bullshit: although it is produced without concern with the truth, it need not be false. The bullshitter is faking things. But this does not mean that he necessarily gets them wrong." (emphasis mine)

Large Language Models and quick-fix marketers both attempt to simulate human connection. Sometimes they get it right, making them logical factories of commoditization. That's why our discernment is more necessary than ever.

Need to understand the impact of commoditization in your trust business? Schedule some time with me below and let's discuss your options. Stay tuned to our Thursday issue where you'll get first access to our Guide to AI Use in the Business of Trust. We'll address how marketers are using AI and give you sample policies you can use with your marketing team to ensure your business retains its competitive edge.

LET'S CHAT.

