The Final Frontier

I agree, it is by choice, but only to a limited degree. Choose right now to live without a car. You will make it unnecessarily difficult on yourself to function in modern society. It would be foolish. Hell, at this point it’s difficult to navigate the modern world without a smartphone. The freedom we lose to technology is that the most basic things associated with our very survival are no longer under our control. More and more, our physical needs depend on the system to provide them.

The thing I wonder, regarding for instance the excerpt you posted about human neurons being grown and integrated with a chip: do the people doing this ever think to themselves, “Are we the baddies?”

Exactly, but this is where paradigm kicks in again. If we want pre-Industrial Revolution life, we can’t also have modern life. There were no Apple Pay kiosks, no ATMs, no cards, no vehicles, no grocery stores, et cetera. Most people didn’t have electricity or even toilets.

I do think there is truth to “plugging in” for needs, but it’s still possible to choose a life that doesn’t require it. You just have to give up everything that didn’t exist then to have the freedom of what used to be.

I think that’s the part that remains to be seen. I’m curious to see how this plays out. We do have Neuralink and the ability to plug in physically, and now we have lab-grown human neurons attached to computer chips, showing symbiosis and the ability to sync.

On one hand, if you could learn as fast as AI I think you would be a baddie. On the other, if AI “takes over” and essentially uses us as its host or fuel source, I suppose we are fertilizer.

It really depends on how it aligns with our biological imperatives at the intersection of “I think, therefore I am.”

Is there more to life than eat, sleep, and fire off the custard cannon from time to time?

What do you have to do to accomplish those ends?

How you answer those questions has a lot to do with how you’ll perceive the impact of AI.


Agreed. If AI stays positive, it potentially opens the door to pursue passions, hobbies and legitimate personal growth.

People who never pursued these to begin with will just get to eliminate the drudgery of work in their accomplishment of mastering the couch. Especially with AI in its current iteration.


Adding that creating “artificial general intelligence” versus the “artificial intelligence” we have now is the goal of leading AI developers.

If achieved, this would be the point we jump from an increasingly handy assistant with advanced computational ability to a completely new form of autonomous superintelligence.

True. But to open it up a little, let’s consider this:

Cuz as much as it is romanced and edified, hard work kills. It really does.

Don’t get me wrong, I love the sense of accomplishment I get from doing what I do, but I guarantee it has shortened my life, and many others’, in many ways, directly and indirectly.

I’m a bit unusual in some ways. AI can perfect welding, but only I can eliminate my job.

A big part of the problem, as I understand it (in the US at least), is that people identify as their work instead of their work being something they do.

Change that paradigm and AI being an existential threat shrinks significantly.


I think we are on the same page. If money is automated, it opens different doors of focus essentially.

To reference Maslow again, we will all potentially have the capacity to pursue legitimate self-actualization without the energy suck of necessary work in the way.

But if there isn’t the desire to do so, we will become the ultimate lazy fucks instead. Or there’s a personality-driven divergence.

More broadly, it would be the first time in history this type of existence was available to everyone, from the financially elite to the janitor. And humanity would be redefined. So far we’ve simply gotten better at surviving above an animal level of hand-to-mouth existence, with cultural development growing (for better or worse) in tandem with freed time and energy.

Or AGI is accomplished, our brains turn out to be more efficient power sources for it than the data centers we are building (per Cortical), and we are the Matrix pods, lol.


I think there is a lot of room to speculate about how psychology overlaps with a changing landscape of existence, and even root consciousness.

It’s too bad @EmilyQ is quiet these days.


Yeah. If people’s brains were cars, hers would be very fast and well appointed.


No, it’s not the hard work that shortens life; it’s the combination of other stresses. Yes, the machine is designed to burn us all out, but hard work, especially exercise, is a stimulus for new growth that gives quality of life.

People retire and live a poor quality of life.

But even AI won’t help here; in fact, it will actually cause more burnout for plenty of folks, because now with AI one person has to juggle more shit for the machine (the machine is the system).

This is where I think we diverge. I don’t see this happening in a practical sense, and I also think people would actually end up less fulfilled in a world where they have no bearing on their basic needs. I mean, depression and suicide rates are higher now than in early industrial and pre-industrial society. The human condition is odd in that way.

It’s a fascinating rabbit hole to think about if we actually got to that point.

Just in to say Roko’s Basilisk is real

Real stupid

Which is exactly why the people steering the collective ship fully believe in it

AI can’t even put my porn in alphabetical order reliably. I’m flaccid and unimpressed on every level.

I think this is where @SkyzykS’s post ties in, if I read it correctly. Essentially, people will either lose purpose or find a higher one.

I tend to agree. I don’t know that statistics on suicide were super accurate early in the Industrial Revolution. That would have required significant manual data entry and consolidation via horseback mail sack.

I’m sure losing a sense of purpose would lead to depression and then suicide, but I’m not convinced AI providing for basic needs would end purpose so much as shift it.

Out of curiosity, I searched suicide rates among the wealthy. I didn’t do a deep data-table dive, but, perhaps ironically, an AI summary said that, categorically, as wealth increases, suicide rates decrease, with extremes at opposite ends of the spectrum (very high for the very poor, very low for the very rich).

One caveat was that when isolating for the mentally ill, rates among the wealthy actually increased. The theory is that there’s a greater stigma around mental illness, so intervention is avoided and shame is multiplied.

My layman thoughts are:

  1. With basic needs met, there is less baseline stress than you’d find in people who work to survive, and the poor are flat out struggling and overwhelmed.
  2. AI, if it goes right in this specific context, will essentially eliminate struggle for everybody. I think the argument could be made that post-industrial life has grown complex, as you’ve mentioned; however, AI is positioned to smooth complexity. It’s already doing so in many areas it touches, but the change it’s forcing hasn’t caught up yet.

Yeah. And if you lose purpose that easily, it probably wasn’t much of one in the first place. Like, I’ll find something to do. When I was having a hard time finding a job as an early-20s dude with a fresh and still-growing criminal record, a buddy and I got together and created our own job: a tree & stump biz.

I’m not nearly as motivated or energetic as I was then, but if I want to get busy, I will.

If my understanding is correct, one of the primary elements of suicide is a feeling of being a burden. How they come about that can vary.

What I did see as a kid in Pittsburgh that could be analogous was when the steel industry collapsed. That was like a slow-motion mass casualty event. My dad took on a second job teaching logic circuitry and electronics fundamentals, but a lot of people sat at the bar waiting for the mills to reopen and drank themselves to death.

Not to get too political, but it is what it is:
Another to consider is when the coal industry was shut down through Appalachia. Overdose deaths skyrocketed. Nothing like the President openly declaring war on your way of life to the cheers and sneers of millions to make you feel like a piece of shit.

So with a slower phase in, as AI develops, I don’t think we’ll see the dramatic increases like those, but probably a slow uptick in the average.

Faith and religion used to be a much greater driver of sustaining life too. Like, when we talk about purpose and higher purpose, there’s a big one that has been in steady decline. You worked the fields, but you lived for God.

I think faith/religion going missing from our society the way it has will probably exacerbate the risks as people feel displaced, disenfranchised, and as though they’ve become a burden on those around them and society as a whole.

This is something I’ve discussed a couple of times before, and I feel that the majority of the human population couldn’t handle the situation. Look at Qatar as a first example, where part of the population works only nominally while getting full pay (and others are practically slaves). Not surprisingly, depression and passivity have become a problem there. Many just watch Netflix or social media day after day.

Aristocrats and nobles throughout history made up a bunch of rules, etiquette, and social responsibilities just to keep people occupied. It’s because humans aren’t built for passivity, but not everyone is capable of using the freedom they would get.

If we moved to a utopia where work was not needed, most people wouldn’t be self-driven or autonomous enough to figure out ways to improve themselves or society willingly.

That’s why there would need to be an obligation to do something for x number of hours each day: learn to play an instrument, play a sport, write, etc. Something that gives you a purpose.

But I feel this utopia is not even close; current developments suggest we’re heading towards a tech-lord dystopia. A small, ultra-wealthy group has already formed that holds the tech and AI in its hands, and the current trajectory suggests most people will end up poor and useless.

One interesting thing still to mention is what this development does to our economic system. Maybe AI will mark the end of free-market capitalism? The whole concept of supply and demand is changing: if there’s less work, there will be less income, if we rely on work-based income, that is.


A couple things I would be curious about:

  1. Which population segment is reporting increasing depression numbers, or is it equal across segments? I would hypothesize it matches the data I found showing that suicide rates decrease as wealth increases.

  2. How is passivity defined, and is a worldview requiring struggle for survival having a hard time grasping abundance? This steps outside AI a little for now, but if you don’t have to work to eat, do your values shift? And when does different become bad? Depression and suicide would be, but I’d loop back to question 1 and other studies showing suicide reduction among the wealthy.

    I would suggest the definition of the “human condition” is a manifestation of how we live now, as it should be, not necessarily an accurate blueprint of all or only what we are or could be in a different scenario. We are looking at chapter two from a place rooted in chapter one.

I think we have this scenario today. Some people “work” to maintain a base level of existence and then you don’t see them do anything of value. Sometimes this might be due to a demanding job and long hours and there isn’t anything left, sometimes it’s because some people just don’t have much to offer, innately. They’re already miserable.

I disagree. A big threat I see in AI is the potential to rewrite how we are governed in a way where we lose control of our own autonomy. I certainly wouldn’t want to give government the ability to step into my life and make me do things, just because. That would be the antithesis of achieved freedom.

This ties in. It’s very possible our entire existence will be filtered through AI at some point. If we think syndicated media messaging is bad, AI will be infinitely worse, imo. Who is setting guidelines and controls? What information will be available outside of AI-moderated allowances? If activity is forced, who curates acceptable activities, et cetera? To me this would be the dystopian nightmare: nothing is real, or at least full context is hidden, we have an arbitrary list of rules to live by, et cetera.

I think this is actually a practical question we are addressing as a society now. Everything else so far has been a speculative “what-if”.

I would relate the current iteration of AI to the Industrial Revolution in this regard. Yes, things are changing. Progressing even, if the goal of moving from hunter-gatherer up the pole to modern humanity is considered good. I don’t think we know how it ends yet but if we look at trends in what exists now, jobs are shifting with some relatively menial roles being eliminated while others are created. Tech has been an economic driver for a long time, and it seems this will continue to be true.

My biggest concern personally isn’t how AI as it exists now will affect our landscape. It is causing shifts, but nothing outrageous. I’m more concerned with how ubiquitous it might become as our lens on existence versus an interesting tool, to the extent that our brains literally grow into and sync with a computer processor, for example. I don’t think we could really even consider ourselves human anymore. And would our brains power the system plugged into them, or vice versa?

In this specific example, coal gave way to oil as a primary energy source. Gasoline-fueled combustion engines were a technological advancement, and so were natural-gas-powered energy plants. While the decline of coal was detrimental to a subset of people in Appalachia, the oil industry came into existence and created abundance beyond what coal had done. At a macro level, change created increase and further enabled energy advancement and all that came with it. This is objectively progress, but the shift would be provincially viewed as negative by a geographically localized generation who saw what they thought was a sure thing end, or effectively dwindle to an end of sorts.

I do think adaptability will be key moving forward.

People have always looked for meaning through faith, and there have been many, many religions throughout our history, each ending with the culture that effectively hosted it. I think it’s possible we will see this again, with new religions or some system of “greater than self” emerging.

As a supplement it’s potentially great; as a broad replacement for humans and human common-sense decision-making, it’s going to be a huge disaster!


Good post. I would like to comment more, but again I have real life responsibilities. But one thought: is this necessarily bad?

As we know, the process of sensing and building reality with our senses and brains is ultimately flawed. Humans are limited and selfish. Getting a more precise and objective view from tech-aided senses/brains would make us less human, but maybe something more?


LOL, AI is only “good” and will reward the very few that actually know what to do with it.

The rest, will be forced to juggle more shit and continue to be burnt out.

It’ll get to a point where people will forget how to even do a process manually and will have to rely on AI or someone else to fix it.
People will also have to work every waking minute they have because they will be juggling more shit; more specifically, they will be managing more AI agents and tasks.
Not that that’s much different from now, but it’ll be more expected.

If you work for any company now, you are seeing that they are pushing AI on you if you want to remain relevant. And that is capitalism.