Chris Baumbauer: Personal Musings

World Changing Moments

Posted: May 8, 2023 5:39 pm


I've followed Jonathan Stark's daily emails for a while, and today's touched a nerve. It looked at Steve Jobs's take on the power of disruption and catching the next big thing. The connection Jonathan drew ties into both the mobile revolution of roughly 17 years ago and the AI revolution that OpenAI and ChatGPT ushered in about six months ago.

While some may argue that crypto, blockchain, and web3 were supposed to be the next big thing, they never moved beyond the hype. Perhaps it was the complexity of the initial design, the description not quite jibing with reality, the massive infrastructure investment, or the fraud. I did have my own web3 blog post half written, but I never published it because I didn't want to jump on the me-too bandwagon.

However, what I'd really like to do today is look at other facets of world-changing moments such as the one we're witnessing now, and the ones we saw before with social media and mobile devices. The question is this: do we build first to see if we can, monetize it (or find ways to monetize it), and deal with the consequences later? Or do we start with the concept, take a hard look at the ramifications, and only then build and monetize it?

There is something very seductive about peering into the unknown and making the impossible possible. Being first to market also brings name recognition and dominance in a very new space. One of my biggest criticisms of the Bay Area focuses on exactly this mentality: operating under the guise of changing the world with an app while keeping a narrow, short-term focus on market share or profits and dealing with the consequences later. After all, there is a reason Facebook's mantra was "move fast and break things." While the Bay strives for altruistic goals with the projects and products it creates (such as connecting the world), the hard truth of how our mercantile world works means there must always be some way to monetize the invention, from electricity and power generation all the way to today's information economy. These days, you either pay for a product (with occasional updates that stop unless you upgrade), pay a subscription (guaranteed constant updates and access), or get it for free (you become the product by sacrificing who you are and what you do online). The type of technology and its payment model together dictate the technology's incentive model.

Picking on social media: the primary driver is new user acquisition, and all of the financial metrics for their funding were tied to this. After all, for the platform to be viable, there must be users. To be considered not just any business but a profitable business, the company must make enough money to sustain itself and provide a return on the investment it received to build the product in the first place. Of the three payment models above, the first is ruled out immediately because a social platform is closer to a service than a product. The second could be doable, but given what happened to platforms such as App.net, it is not a sustainable business model. That leaves the third option, which has been the primary model for Facebook, Twitter, TikTok, and the rest. Under it, they can raise funds in two ways: 1. guarantee eyeballs by injecting ads alongside or into the content stream, or 2. provide curated access to the user base to third parties for their own use.

This is why social media is considered an ad platform: inserting ads into the user's feed (targeted on the user's prior engagement) provides one revenue stream, and selling your information for pennies on the dollar provides another. Thus it is in their best interest to keep you hooked on the platform by making it easy to create and engage, promoting controversial topics, and making it incredibly painful to leave. Bear in mind, each may have started off as a way to keep your friends in the loop (Facebook), share cat photos (Twitter), or share where you are (Foursquare).

While much of this is clear now, at the time it really wasn't. We took the platforms at face value, contributed to them, and didn't really think through the ramifications: addiction, ruined relationships, and filter bubbles so strong that we hold a separate concept of truth from our peers and can no longer understand them.

So what does this have to do with artificial intelligence, and with the recent hot topic of OpenAI and ChatGPT? Unlike most of the other movements of the past few decades, AI has been a perpetual "always 10 years in the future" concept. Yes, there was IBM's Watson, and more recently facial recognition, but those seemed like impressive yet small improvements to our day-to-day lives. In a sense, we've all been hoping for ages for the Star Trek computer, voice-activated, able to execute commands and provide encyclopedic knowledge, and now it is here. OpenAI released it to the world, and we are still coming to terms with its power and novelty.

While I do applaud the push for a pause to evaluate the full ramifications of this phase of AI, the pause is voluntary, lacking the authority of a government agency or trade group. It also touches solely on safety protocols to ensure minimal impact on humans. What it misses are the second- and third-order consequences of these systems' introduction, not to mention the underlying profit motivations of the companies introducing them. Without addressing either, we will find ourselves in a situation as bad as, if not worse than, the introduction of social media, where we cannot tell what is true and grounded in reality from fiction imagined by a third party, or from data shaped by institutional injustice that is carried forward to train the underlying models.

That is what made Geoffrey Hinton's interview in the NYTimes so impactful. He left the US for Canada to keep his research from being funded through DARPA, and more recently left Google. Yes, jobs will be impacted, but I suspect we are still not asking the right questions when it comes to understanding the full ramifications of what is being unleashed. Just as with the crypto world, I suspect we as a society still don't understand our technology well enough to navigate a path forward where it acts as a tool that works on our behalf instead of us working for it.

Topics: AI, philosophy, tech
