Could vs. Should: What Jurassic Park Got Right About AI
(00:01.048)
Jurassic Park and AI. Ian Malcolm, played by Jeff Goldblum in Jurassic Park, is one of my favorite characters as far as action adventure movies are concerned, without a doubt. Some great one-liners in there. And one of my favorites that I’ve often used here on the program was, your scientists were so preoccupied with whether or not they could that they didn’t stop
to think if they should. And you could take out the word scientists and put in investment bankers; you could throw a myriad of things in there. It’s applicable to many different things. I have put out warnings over various different bits of technology over the years and the
potential for misuse and what could possibly happen. Go back, I mean, going back to the early days of MP3s, MP3s and Kazaa, and Napster was out there, and the problems. I remember Metallica and some of the bands were like, this is ridiculous, we can’t make any money on our work anymore.
And you can watch. You can go back to that point in time when everybody decided to stream everything, everything was going way cheap, and people were not buying albums anymore. They’re just downloading everything. And you can most certainly see the decline in the quality of music. I’m sorry. It is what it is. You know, the rise of Auto-Tune and fake artists and all of this other
garbage that’s coming out, and now what we’re going to have is actual AI music. The number one country song in the United States last week was an AI song. No people involved. And I went back then and, you know, I talked about music because my love of it is the relationship you had with it. Some people like myself, you know, are
(02:28.108)
back into vinyl once again, you know, holding it, the art, all the things that went along with that, and that’s been taken away. I don’t think Steve Jobs fully realized what his device would do and how it would go about changing things, because it has, without a doubt. Anyway, on from that. We can move on to social media. We here on this program were
upfront and honest. I remember doing guest appearances on the CW Daily Buzz program back in the day, like 2007 to 2010. I was on once a week; I’d have to go out to Orlando and do the show there. And I remember there was a battle between Justin Timberlake and Ashton Kutcher over who could get more followers on Twitter when Twitter first came out, and I said, this is not gonna end well.
It’s not gonna end well. I mean, you think about Facebook. You watch the movie: if you’re not familiar with the story, it was basically a program put together to meet girls. And it turned into something, quite frankly, that I don’t care for. Do I have to use it? Yes. Do I interact on it? I don’t. Much to the chagrin of the people that, you know,
put together this show, I am not getting into conversations and feedback and doing that, because I just don’t think it’s a very humane forum by any stretch of the imagination. I don’t. It’s ungodly to me to interact with people like that, what people have taken it to and how they treat others. You know, the things that people will say online. There was an old thing when we were kids: say it to my face. Most of them would never do that.
And again, that’s a bit of a problem. And I think it’s done more harm than good. I gave the comparison to the pink slime from Ghostbusters 2, the negative energy that has just permeated this country, pitting people against one another. And those algorithms as well, they’re pretty smart. They know what to do. They know what to feed people. They know what to say or what to put in your feed to get you going, to get you upset.
(04:51.982)
All of those things to get you clicking on things. So again, they get more eyeballs and they can make more dollars. And I understand the concept of advertising and ratings back in the day, but this, this my friends, is a little bit ridiculous. And we all understand the whole dopamine hit it gives you and how it affects people. Now we’re on to AI, and this is a pretty,
pretty powerful tool. I’ve said this before: I use AI for search, first of all, looking certain things up. I use AI to take, like, a transcript of one of these programs, to put it in there and clean it up. I don’t allow AI to think for me by any stretch of the imagination.
Greg Ip had a piece in the Wall Street Journal, “The Most Joyless Tech Revolution Ever”: is AI making us rich and unhappy? Discomfort around artificial intelligence helps explain the disconnect between a solid economy and an anxious public. People are concerned. They don’t know whether or not they’re going to be able to hang on to their jobs. They don’t know whether or not they’re going to end up being replaced by AI and what it all means. And we’ve had quite a few apocalyptic movies when it comes to artificial intelligence
over the years and what could happen. I just rewatched Blade Runner, a genius movie from 1982. Again, AI applied to robots that are put out there. We all know WarGames. You could talk about Terminator. There are so many of them out there where these things get out of control, and we’re already seeing certain AI
programs basically wanting to self-replicate and protect themselves from being shut down. We’ve seen the various different hallucinations that some of these AI things have, where they’re just making stuff up,
(06:59.31)
making court cases up, doing various different things. So they’re not reliable. Again, it’s a tool that you have to pay attention to, and you have to be aware of what it’s capable of doing. Several years ago, I guess it’s gotta be going back to like 2016, 2017, here on the program I spent some time talking about this new thing that was coming up, called deepfakes, where…
since it was early AI, you could basically fake videos and fake photographs, and it was pretty obvious back then. You could see it. Not so much anymore. I, for the life of me, don’t understand some of these apps like Sora, where you can take somebody’s picture and likeness and you could
make them do horrible things and put it out there. And there’s no watermark on it. There’s nothing on it to say this is not real. Anybody who’s studied all of the nonsense in the lead-up to World War I and the assassination of the Archduke Ferdinand knows, obviously, there was a Serbian separatist, and you’ve got the Austro-Hungarian
crown prince set to be there. All sorts of issues. Then it just got out of control, various different information flying around that wasn’t true. You’re talking about the ability to
push people in a certain direction via fake videos or fake whatever it may be. A better example of this, quite frankly, is the film Wag the Dog, which is genius. Great movie: Robert De Niro, Dustin Hoffman, Woody Harrelson’s got a little small part in it as well. And how the government,
(09:05.806)
and this is before all of this technology, you know, manufactures a fake war against Albania to get the president, you know, out of trouble.
You can see how this stuff could work. South Park did an episode on it last week about some of the things that this thing can do. You know, it’s amazing. You think about all of the danger. You can see the danger that goes along with some of this stuff, and nothing being done. Let me give you an example. Father Mike Schmitz, he’s a bit of a rock star in Catholic
circles. He’s a part of the Hallow app and Ascension, and he does just unbelievable things from all around the country. People are using deepfakes and fake videos of him asking people for money. He had to put out a whole thing. He said, I’m not doing this. This is not me. People using his likeness to go ahead and do that scares the crap out of me.
It does.
It scares the crap out of me. Somebody could take my likeness and put it out there and say that I’m recommending this, that, or whatever it may be, and people can act upon that. Why are we not doing anything to rein this in? You don’t think that this is a danger, for crying out loud?
(10:38.474)
I mean, we don’t allow kids to drink until they’re 21 here in the United States. Sure, they can go off and they can kill themselves in some neocon war, but they can’t drink until they’re 21. And yet we allow something as dangerous as this? Is this not more dangerous?
I mean, you know, the big alcohol lobbyists just, you know, basically wrecked the hemp industry here in this country because they want people drinking more again. And you mean to tell me that that’s more dangerous than this?
You don’t see the tremendous potential for all sorts of nefarious, if not deadly, things that could happen with this? And we all know that in our society a lie will make its way around the globe millions and millions of times before the truth comes out the front door.
Was it Ben Franklin who said that, you know, a lie will make its way around the world, around the country? That was his point in time; now, with technology, it’s going viral millions of times around the globe before the truth gets out the front door. You see the type of harm that could be done with this?
And we’re just taking a step back.
(12:18.586)
You take a look at this Sora app and what it can do. And South Park parodied this this past week, with the kids on South Park putting out horrific videos about each other and putting them online. OK, you’ve got to watch it. You know, I’m not going to describe the things on the show, where various different cartoon characters were raping kids and all sorts of stuff. But South Park’s a little demented; that’s neither here nor there.
There’s that other line from Jurassic Park. What was it Ian Malcolm said? He was talking about the awesome power of genetics: the most awesome power in the world, and you guys are wielding it like a kid wielding his father’s gun in the house. You don’t think about this in the hands of kids?
You don’t think about the type of bullying that could take place with things like this? You don’t think that this is the slightest bit dangerous at all?
(13:19.82)
Listen, people, I don’t know what to tell you. I’m not a big rules and regulations guy, but again, I like to think that we live in a world where people can police themselves. But we don’t, okay? It’s a post-Christian world we’re living in right now. Most people don’t fear God at all. They don’t. Again, they’ll do whatever they want. It makes it very difficult to be a
libertarian, you know, type of a guy when we live in a world where that many people cannot police themselves. You know, I don’t know how you’re gonna put the genie back in the bottle with this. You probably aren’t. But don’t you think we should have serious penalties for messing around with people’s likeness, serious repercussions? I’m talking serious jail time type stuff.
I mean, you could destroy someone’s reputation, and you know how much we’d have to fight to get that back after something like this?
Again.
Just saying. I’m just warning you, and again, I’ll give you another Ian Malcolm quote: boy, do I hate being right all the time.
(14:45.516)
WatchdogOnWallStreet.com.

