What if AI could replace your creativity? Discover the implications of artificial intelligence in education and creative fields as we sit down with graphic artist Benjamin Jancewicz and Dr. Devon Taylor, Ed.D., from Shenandoah University. Listen in as they share their insights into using AI in academia, supporting students, and preparing them for the workforce.
Explore the potential dangers of AI taking over creative work, the ethics of labor and automation, and the importance of teaching younger generations how to use AI as a tool. Our guests also discuss the positive side of AI, enabling more creative expression, and tackling bigger issues in the world. We dive into the complexities of AI and its impact on various industries, highlighting the essential role of the human element in this ever-evolving technological landscape.
As the use of AI continues to grow, we touch on the potential implications of this technology for the 2024 US election, including the risks associated with deepfakes and image and video manipulation. Join us for an engaging conversation that delves deep into the world of AI, creativity, and the future of our industries.
More from Benjamin:
Business: Zerflin.com, @Zerflin on all socials
Art: Art.Zerflin.com
Personal: BenJancewicz.com, @BenJancewicz on all socials
More from Devon:
Twitter
LinkedIn
Links from the show:
Question or Comment? Send us a Text Message!
Contact Us
Enjoy Basic AF? Leave a review or rating!
Intro Music: Psychokinetics - The Chosen
Show transcripts and episode artwork are AI generated and likely contain errors and general silliness.
00:00 - Intro
01:13 - Say Hi to Benjamin Jancewicz
03:23 - Say Hi to Dr. Devon Taylor
03:47 - AI, We Need to Have a Talk
01:05:35 - Closing
WEBVTT
00:00:00.059 --> 00:00:01.606
Dude, you're old, I just want to point that out.
00:00:01.645 --> 00:00:15.266
KC, KC... Welcome back to the Basic AF Show with Tom Anderson and Jeff Battersby.
00:00:15.266 --> 00:00:18.981
Jeff, it is, as always, good to see you again. Don't lie.
00:00:18.981 --> 00:00:22.789
Tom, we know it's not good to see me. It is.
00:00:23.339 --> 00:00:24.644
We're 100 percent certain of that.
00:00:24.644 --> 00:00:28.368
Yep, you say it, but just because you want me to stick around. I get it.
00:00:28.469 --> 00:00:29.513
Thank you. Manipulator. Great.
00:00:33.344 --> 00:00:35.750
As if I don't already have a few of those people in my life.
00:00:35.750 --> 00:00:39.649
We're also happy to welcome two guests.
00:00:39.649 --> 00:00:42.008
Tom, we have two guests on the show. We do.
00:00:42.008 --> 00:00:49.033
First of all, we have Benjamin Jancewicz, who will introduce himself in just a brief moment.
00:00:49.033 --> 00:00:52.469
Then we also have Dr. Devon Taylor.
00:00:52.469 --> 00:00:55.106
Sorry if I put too much emphasis on the doctor.
00:00:55.106 --> 00:01:02.881
There's one smart guy in the room and it's you, and we want to make sure everybody knows that that's the case. I wouldn't go that far.
00:01:06.400 --> 00:01:08.507
What we'd like you guys to do is introduce yourselves.
00:01:08.507 --> 00:01:13.605
Benjamin, why don't you tell us a little bit about you and then we'll talk about what our subject is for the day?
00:01:14.147 --> 00:01:22.200
Sure, I am tuning in from a tiny village called Kawawachikamach, which is where I grew up in Northern Quebec.
00:01:22.200 --> 00:01:23.504
Say that again, please.
00:01:23.504 --> 00:01:27.585
Kawawachikamach. It means "by the bend in the lake."
00:01:27.585 --> 00:01:33.390
All right, it's a tiny Native American reservation up in the subarctic of Northern Quebec.
00:01:33.390 --> 00:01:36.103
I come up here.
00:01:36.103 --> 00:01:41.742
I try to come up here at least once a year just to reconnect with people I grew up with in the tribe.
00:01:41.742 --> 00:01:47.709
I'm stranded up here because of the forest fires.
00:01:47.709 --> 00:01:55.162
I'm sure most of you down south, as we say (upstate New York and everywhere else), have seen the orange skies.
00:01:55.162 --> 00:01:59.031
That is because things are burning in western Quebec.
00:01:59.031 --> 00:02:06.903
There are also small fires all over the place, and it has stopped the train line, which is the main way in and out of here.
00:02:06.903 --> 00:02:12.890
We've got fiber optic and so I can do podcasts.
00:02:13.461 --> 00:02:14.104
Yeah, look at you.
00:02:15.580 --> 00:02:20.650
I've been a graphic artist for the past 24 years.
00:02:20.650 --> 00:02:27.031
I run my own small business doing that.
00:02:27.031 --> 00:02:30.909
We have a whole handful of employees.
00:02:30.909 --> 00:02:33.145
The name of the company is Zerflin.
00:02:33.145 --> 00:02:50.730
We specialize in branding and web design, and we did a lot of pioneering work just as a small boutique agency in the '90s, making adaptive websites that fit your browser or phone or whatever you're viewing them on.
00:02:51.270 --> 00:02:52.332
They're cool. Yeah.
00:02:53.234 --> 00:02:54.415
All right, good to be aboard.
00:02:55.222 --> 00:02:56.826
Yeah, thanks for joining us.
00:02:57.048 --> 00:02:57.528
Appreciate it.
00:02:57.528 --> 00:02:58.984
And Tom, you want to introduce your friend?
00:02:59.425 --> 00:03:00.729
Yes, as if you had one.
00:03:00.729 --> 00:03:04.725
That's the only one, and a half next to you, Jeff. You guys are friends.
00:03:04.725 --> 00:03:07.687
Well, it's a perfect thing.
00:03:07.687 --> 00:03:10.419
It's tumultuous.
00:03:10.419 --> 00:03:15.591
We're off the rails a minute in. Good.
00:03:15.591 --> 00:03:17.681
So, joining us today:
00:03:17.681 --> 00:03:24.526
Dr. Devon Taylor from Shenandoah University, a colleague of mine. Devon, welcome to the show.
00:03:25.040 --> 00:03:25.683
Yeah, thank you.
00:03:25.683 --> 00:03:29.189
Very nice to be here today to talk about this subject.
00:03:29.189 --> 00:03:34.110
I'm excited to talk with you folks and see what we can come up with.
00:03:35.061 --> 00:03:38.350
Yeah, we'll come up with something. We usually do.
00:03:38.699 --> 00:03:42.188
We're going to solve all AI related problems here on the show.
00:03:42.188 --> 00:03:43.050
Absolutely correct.
00:03:44.040 --> 00:03:46.068
And actually, Benjamin, you brought up the big point.
00:03:46.068 --> 00:03:54.185
So what we're talking about today is artificial intelligence, which, personally, I would put the emphasis on artificial.
00:03:54.185 --> 00:04:00.871
Some others in this room might put it on intelligence or some combination in between.
00:04:00.871 --> 00:04:03.046
So what we want to do is we want to talk about this.
00:04:03.046 --> 00:04:42.005
Benjamin, you, as you said, work in graphic arts, and there are obviously some potential problems for artists with regard to this. I just want to point out that our beautiful new artwork came about because Benjamin poked me after I had created some initial artwork, which we used for a while, with Diffusion Bee. He asked me whether or not I had created it using any kind of AI, and I had. I wonder what gave it away.
00:04:42.701 --> 00:04:46.651
Oh, it's real easy to spot if you know what to look for.
00:04:47.519 --> 00:04:49.305
The people with 10 fingers on one hand.
00:04:50.980 --> 00:04:55.930
I just think it's ironic that you chose a human to draw a couple of robots as your look.
00:04:56.211 --> 00:04:56.492
Is that?
00:04:56.492 --> 00:05:00.627
Well, that's closer to the reality, I think.
00:05:00.627 --> 00:05:03.283
We did that. So we did end up...
00:05:03.283 --> 00:05:09.509
we tried to hook up with Benjamin to get him to take care of it, but time and other things didn't work out.
00:05:09.509 --> 00:05:12.988
So we had, as everybody knows, Randall Martin Designs.
00:05:13.569 --> 00:05:14.232
Beautiful design.
00:05:14.620 --> 00:05:17.887
Yeah, really, we are really happy with it.
00:05:17.887 --> 00:05:33.889
And, devin, you are working in education and trying to find ways to lean into AI as a means of being able to either support students I don't want to put any opinions in anybody's mouths.
00:05:33.889 --> 00:05:43.470
What I'm interested to hear is what your thoughts are on it, and we'll start with you, Devon, or Doctor, whatever you want me to call you.
00:05:43.939 --> 00:05:45.095
Devon, please.
00:05:49.005 --> 00:05:53.413
There was a guy once that used to make us all call him Doctor.
00:05:53.413 --> 00:05:55.548
He was an HR guy at the company I used to work for.
00:05:55.548 --> 00:05:56.021
I would.
00:05:56.021 --> 00:06:00.411
The only time I would refer to him as Doctor is if I used evil at the end of it.
00:06:00.411 --> 00:06:02.322
So he was.
00:06:02.322 --> 00:06:12.252
He was a... I won't say the word here because we don't want to drag down our rating, but he was not a kind fellow.
00:06:12.252 --> 00:06:30.615
Anyway, Devon, why don't you lead in, and let's talk a little bit about, first of all, your experience with AI, how you're using it, and, you know, where you hope to go or how you hope to control it in your particular environment?
00:06:32.000 --> 00:06:32.139
Yeah.
00:06:32.139 --> 00:06:46.521
So, yeah, I'm thinking about it in two different ways when it comes to AI and higher education, and the first is students using it to help them create content, right?
00:06:46.521 --> 00:06:55.992
You know, these large language models are really good at creating content, whether that be images, text, and soon video and audio, right?
00:06:55.992 --> 00:07:19.108
So thinking through ways that students can use that as a starting point for some of the things that they're going to be working on in their classes, but, on the other side, also thinking about how do we need to start training our students in these tools to be prepared for entering the workforce, where they're likely going to be using these tools in some fashion.
00:07:19.108 --> 00:07:33.411
So thinking about it in two different aspects there, right: preparing our students for the workforce while, at the same time, thinking about how we're going to use it to have them complete assignments and work on different aspects of higher education.
00:07:34.254 --> 00:07:34.574
Interesting.
00:07:34.574 --> 00:07:42.333
So let me ask you a question about that, and this is one of the things that comes up for me, particularly in education.
00:07:42.333 --> 00:07:56.072
You know, obviously one of the things about higher education is training people, hopefully, how to think; you know, teaching them how to be critical thinkers and work in that kind of way.
00:07:56.072 --> 00:08:22.773
I know Dr. Holland, who's the professor that I spoke of a few minutes ago when we were just talking in the background (I don't think we recorded that). One of her concerns, and the one that I would want to express to you, is that you're going to have issues where the thinking piece is kind of thrown out the window.
00:08:22.773 --> 00:08:23.562
The process.
00:08:23.562 --> 00:08:41.163
Say, if you're writing a paper where you're coming up with ideas and, you know, your initial points to put together some kind of project: if you're having the AI start out that thinking for you, are you doing the groundwork that needs to be done?
00:08:41.163 --> 00:08:45.828
So I'm curious to know how you balance that.
00:08:45.828 --> 00:08:47.432
You know, in academia.
00:08:48.942 --> 00:08:52.351
Yeah, I think, you know, it's a starting point, right?
00:08:52.620 --> 00:08:57.672
It should always, I think, be taken as: this is a place to start, or even a place to critique.
00:08:58.399 --> 00:09:21.389
So we have a history professor that started every class this last semester by prompting ChatGPT with whatever the topic of the day was, and then they would spend the first 10 minutes seeing what it got right, what it got wrong, kind of critiquing the large language model, so that they could learn from, you know, what it's doing well or what it's completely off the rails on.
00:09:21.389 --> 00:09:24.148
And I think that's a learning experience for the students as well.
00:09:24.148 --> 00:09:31.177
They see that it's not always correct and it should never be taken at face value for the things that it spews out.
00:09:31.177 --> 00:09:32.524
Right, because it's just doing math.
00:09:32.524 --> 00:09:41.451
It's not knowledgeable, right, it's not intelligent, it's just doing math to predict what the next few words or sentences are going to be.
00:09:41.451 --> 00:09:52.706
So I thought that was a really interesting use case, you know, for getting students used to how these things work, as well as showing them that they're not always the end-all be-all when it comes to content.
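[Editor's note: as a toy illustration of Devon's "it's just doing math to predict what the next few words are going to be" point, here is a minimal next-word sampler. The vocabulary and probabilities below are made up for the example; a real model scores its whole vocabulary with a neural network, but the sampling loop is the same idea.]

```python
import random

# Toy next-word prediction: the "model" is just a lookup table of
# made-up probabilities for which word tends to follow the current one.
toy_model = {
    "the": [("cat", 0.5), ("dog", 0.3), ("idea", 0.2)],
    "cat": [("sat", 0.6), ("ran", 0.4)],
    "dog": [("barked", 0.7), ("sat", 0.3)],
    "idea": [("spread", 1.0)],
}

def next_word(word):
    """Sample the next word from the toy distribution."""
    words, probs = zip(*toy_model[word])
    return random.choices(words, weights=probs)[0]

text = ["the"]
while text[-1] in toy_model:        # stop at a word with no continuation
    text.append(next_word(text[-1]))
print(" ".join(text))               # e.g. "the cat sat": fluent, not knowledgeable
```

[Nothing in that loop checks whether the output is true, which is exactly why the classroom critique exercise described above works.]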
00:09:53.519 --> 00:09:58.102
Yeah, I appreciate that, actually, and I appreciate that you say that it's not intelligent, it's iterative.
00:09:58.102 --> 00:10:00.971
You know, that is definitely, to my mind, what it is.
00:10:00.971 --> 00:10:02.043
It's iterating on,
00:10:02.043 --> 00:10:09.605
you know, what the next possible word is in a sequence, whether there's truth to that or not.
00:10:09.605 --> 00:10:13.548
So that makes a lot of sense to me.
00:10:13.548 --> 00:10:27.606
Benjamin, you know, your thoughts about this have to do with art and the creation of those kinds of images.
00:10:27.606 --> 00:10:39.370
I guess the question I have about that... I mean, you said that our original art definitely looked like, you know, bad album covers from the '70s is what it really looked like.
00:10:40.982 --> 00:10:42.005
But I mean, it wasn't.
00:10:42.005 --> 00:10:42.928
It wasn't great.
00:10:42.928 --> 00:10:44.345
No, it wasn't great.
00:10:44.345 --> 00:10:46.225
What you have now is great.
00:10:47.087 --> 00:10:47.229
Yeah.
00:10:47.229 --> 00:10:56.504
So why don't you talk a little bit about the difference: what is great, what isn't great, what you see. I mean, I look at some of the stuff that I see.
00:10:56.504 --> 00:11:17.914
In fact, Tom sent me a link last night to a picture (it was four different pictures, all AI generated) of supposedly a Native American from Alaska, and, you know, at first glance it looked like a legit photograph.
00:11:17.914 --> 00:11:23.432
I mean, I would have said it looked like a legit photograph of a real human, you know.
00:11:23.432 --> 00:11:24.642
No crooked eyes.
00:11:24.642 --> 00:11:27.650
No, you know, seven fingers and three toes kind of thing.
00:11:28.442 --> 00:11:29.725
Yeah, it's in a matter of months.
00:11:29.725 --> 00:11:30.908
It's gotten better at that.
00:11:31.629 --> 00:11:33.620
Yeah. So...
00:11:33.620 --> 00:11:34.283
what is it?
00:11:34.283 --> 00:11:41.562
Do you see any benefits to using AI for the creation of art, or is it all downside to you?
00:11:45.102 --> 00:11:49.753
Well, fundamentally, I agree with what Devon was saying: it's not intelligent.
00:11:49.753 --> 00:11:53.568
So I think we need to establish that as a baseline.
00:11:53.568 --> 00:12:05.572
It's not intelligent yet. There are people who are working on intelligence, but what most people are talking about when they talk about AI-generated things is not intelligent.
00:12:05.572 --> 00:12:30.136
What it's doing is using repositories of information to generate things based on other people's work, and I think that's important to establish, because, you know, the stuff that you're viewing is based on something somebody else did. A person.
00:12:30.136 --> 00:12:50.511
Ultimately, now, it's very, very finely remixed, I would say, to the nth degree, where at least the creators of these programs are not interested in attributing, so they don't actually do the tracing to tell you what percentage of an image comes from where,
00:12:50.511 --> 00:12:55.130
you know, from what certain pieces, from what certain bits of artwork.
00:12:55.130 --> 00:13:15.640
But yeah, the biggest issue that I have, especially in the past year or so, is that this AI generation has taken leaps and bounds in directions of things that people love doing.
00:13:15.640 --> 00:13:25.434
Creating artwork, photography, designing things is something that people love to do.
00:13:25.434 --> 00:13:27.466
I switched majors because of it.
00:13:27.466 --> 00:13:33.672
I was originally in engineering and I switched because I love graphic design so much.
00:13:33.672 --> 00:13:42.022
I would honestly do it for free, but I'm fortunate enough to have people pay me for it.
00:13:42.783 --> 00:13:52.743
It seems a shame that these tools are being used to replace people who honestly love doing this stuff.
00:13:52.743 --> 00:14:12.605
I do think that, you know, there are places and areas where it can be used as a tool, and I think that Devon is on the right track of making sure that, you know, these younger people are learning how to use it.
00:14:12.605 --> 00:14:22.210
But the album cover that you guys put up before was pretty obviously AI generated.
00:14:22.210 --> 00:14:25.187
To me, there were a lot of different tells in that old version.
00:14:25.187 --> 00:14:39.447
I bet you, if you did the same experiment today and tried to recreate and put in the same exact thing, you would get a much more realistic and much more defined result today, just because of how much it's grown and changed since then.
00:14:41.134 --> 00:14:49.682
But yeah, I'm firmly in the camp of, you know, let's use these tools to do things people don't want to do, rather than things people do.
00:14:49.682 --> 00:15:34.986
The other point is that, especially given the subject matter, where you were talking about images of northern Native people: I am here on a reservation where I recently redesigned the website for the tribe, because they are underrepresented, because they don't have as loud of a voice as they should have, and people are using things like AI-generated imagery and text to be lazy and not actually go and get those photographs and stories on their own, which is what they should be doing, because these cultures are disappearing.
00:15:35.649 --> 00:15:39.201
So yeah, interesting, very interesting.
00:15:39.280 --> 00:15:46.184
So you're in First Nations country, right? Yes, yes, I am in a Naskapi First Nations village right now.
00:15:47.407 --> 00:15:47.567
Yeah.
00:15:47.567 --> 00:15:58.436
So this raises an interesting question for me, and one of the things that's kind of challenging, and, Devon, I'm curious to know what you think about this.
00:15:58.436 --> 00:16:24.429
One of the problems for me with any of this generation is that it's currently coming only from data sets that are publicly available. In other words, for academia, every single article, maybe a research article (and there are arguments to be had about this as well), any journal papers, anything like that, is typically paywalled.
00:16:24.429 --> 00:16:28.831
You know, behind some type of paywall that you have to pay to get actual research data.
00:16:28.831 --> 00:16:30.355
That's going to be live.
00:16:31.947 --> 00:16:57.736
What it is feeding off of, so far as we know (and that's another problem with this for me: we don't know exactly what data sets are being fed into these AIs), but one of the things that we do know is that most of what it's getting is coming from the broader internet, which we all know as the frigging Wild Wild West. Maybe worse than the Wild Wild West.
00:16:57.836 --> 00:17:00.033
It might be, you know, very nearly post-Weimar Germany
00:17:00.033 --> 00:17:07.374
in some aspects, or in some, you know, places on the web.
00:17:07.374 --> 00:17:20.810
So I really love, Devon, what that history professor did, which was, you know, feed ChatGPT and then see the BS that gets propagated by it, or some truth.
00:17:20.810 --> 00:17:46.036
And that's, I think, the problem, or, from my perspective, one of the issues with it: half the information is correct, half the information is a lie, or, you know, made up, and then being able to think about and figure out, you know, parse where the lie is and where the truth is, especially when there is no access to academic data.
00:17:46.036 --> 00:17:47.945
So what do you think about that?
00:17:47.945 --> 00:18:00.268
How do you resolve the issues, at least, that we currently have with the data sets only coming from publicly available sources?
00:18:01.511 --> 00:18:02.273
Yeah, it's...
00:18:02.273 --> 00:18:08.478
It's a tough problem, right, because we don't know, at least for most of the language models.
00:18:08.478 --> 00:18:14.163
We don't know where the data is from and most of them are probably using a lot of the same data sets.
00:18:14.163 --> 00:18:23.382
Right, it's everything that's publicly available on the internet, and so smart companies are locking down their data and saying you don't get this anymore.
00:18:23.382 --> 00:18:25.509
Right, this is now proprietary.
00:18:25.509 --> 00:18:27.999
We're going to start rolling our own, right?
00:18:27.999 --> 00:18:32.175
For instance, Reddit has not allowed anybody to scrape their data.
00:18:32.175 --> 00:18:34.183
They have a huge data set, right.
00:18:34.183 --> 00:18:40.192
That probably would be extremely beneficial when it comes to questions and answers and and other things.
00:18:40.493 --> 00:18:44.450
Right, Reddit's really good at product reviews, things of that nature.
00:18:44.450 --> 00:18:50.832
So, I mean, I think we're still very early days, right? We've only had ChatGPT since November.
00:18:50.832 --> 00:18:57.637
Now, GPT has been around in beta for maybe a year and a half, right? People have been playing around with it.
00:18:57.637 --> 00:19:13.461
So I think it's still really early days with this technology, and, like Benjamin said, just in the past couple of months these things have gotten so much better on the image side, but also on the text side.
00:19:13.461 --> 00:19:30.931
You know, there's a really interesting infographic that shows the advances of ChatGPT, which was, you know, GPT-3.5 to GPT-4, and the leaps and bounds in knowledge of what it's able to do when it comes to passing some tests.
00:19:30.931 --> 00:19:43.155
Right, it's in the top 20 percent, the 80th percentile, on all of these different scores: the GRE, the bar exam, the medical exams.
00:19:43.155 --> 00:19:45.924
So they're advancing at a pretty fast pace.
00:19:45.924 --> 00:19:55.643
So I think eventually we'll get to the point where some of this gibberish and you know nonsense that it's spewing out will become not the norm.
00:19:55.643 --> 00:19:59.076
Right, it's going to be kind of the outlier out there
00:19:59.076 --> 00:20:05.800
that will then be fixed, right, by the human reinforcement learning that these training systems go through.
00:20:05.820 --> 00:20:07.690
Yeah, so that's interesting.
00:20:07.690 --> 00:20:31.566
We do have humans behind this looking at and, hopefully, correcting the information, you know, which adds some subjectivity, perhaps, to the training of these models, and that subjectivity is challenging, right? You know, you don't know who's deciding what's the truth in the language models that are being used, and that brings a healthy dose of bias, I'm sure, as well.
00:20:31.566 --> 00:20:40.587
Oh yeah, well, you know, if I was training it, it would... You know, there are also these ethics questions.
00:20:40.647 --> 00:20:41.891
I know, I know, Jeff.
00:20:41.891 --> 00:20:53.633
I sent you the article about the 150 Nairobi workers unionizing, because they weren't paid enough, who were on the back end reviewing all the ChatGPT stuff.
00:20:53.633 --> 00:20:56.842
So, you know, it's...
00:20:56.842 --> 00:21:04.925
It's being used actively to exploit Black workers right now, which is not a great look. Yeah, no, and that's...
00:21:05.067 --> 00:21:05.911
I think that's another.
00:21:05.911 --> 00:21:07.740
You know that's another interesting.
00:21:07.740 --> 00:21:16.078
Interesting question. You know, Tom sent me an article last night about a radio station.
00:21:16.078 --> 00:21:17.267
Where was that radio station?
00:21:17.848 --> 00:21:31.329
I believe that was in Portland, again, which has a DJ that is now an AI DJ; it sounds just like the woman who is, you know, actually DJing the show live.
00:21:31.329 --> 00:21:35.348
Their thoughts are well, you know, she can't do it 24 hours a day.
00:21:35.348 --> 00:21:43.551
So if we have a big event come up, there's a traffic issue, or there's, you know, a big accident on some highway in Portland or something like that, we just flip on the AI.
00:21:43.551 --> 00:21:47.467
But we're never, ever, ever gonna, you know, use this for real.
00:21:47.467 --> 00:21:51.499
Well, you know, that's BS, right? You know, they...
00:21:51.598 --> 00:22:02.789
First of all, I will say for the 50 millionth time, you know, I don't like algorithm... algorithmically... man, I'm in tech and I can't say that word.
00:22:02.789 --> 00:22:09.971
You know, algorithms creating playlists, for me, they never satisfy.
00:22:09.971 --> 00:22:42.368
You know, I listen to Radio Paradise, which is a human-DJed radio station based out of California, or was based out of California. But I'm never satisfied that the thing that we're getting (and I think this goes to your point, Ben) is going to be as beautiful or as real as something a human created, because, first of all, there's no soul. Correct, right?
00:22:42.368 --> 00:22:43.832
Which maybe is tech in general.
00:22:45.015 --> 00:22:51.902
I agree with you, especially on the music front, but I also think that there is room for AI in that.
00:22:51.902 --> 00:22:54.430
I am not a Spotify user anymore.
00:22:54.430 --> 00:23:03.923
I jumped ship when they wouldn't drop Rogan. But, as we're talking on a podcast that Spotify doesn't carry...
00:23:03.982 --> 00:23:23.586
Nobody carries it. I really loved Spotify's new music algorithm, which did a great job of paying attention to what I was listening to and giving me suggestions of other artists that I might like.
00:23:23.586 --> 00:23:31.046
It wasn't a Gestapo, and it didn't add them to my playlist, but it gave me the option. It said, hey, you want to listen to these?
00:23:31.046 --> 00:23:33.153
You've been listening to classical for the past month.
00:23:33.153 --> 00:23:33.954
Would you like to try,
00:23:33.954 --> 00:23:40.087
you know, these artists? Or would you like to, you know, listen to this techno or dance house music?
00:23:40.087 --> 00:24:04.567
Because you've been, you know, working out to this. And I do think that that algorithm is really well based. And this isn't new, either. I don't know if you guys are familiar with the franchise Jack FM, but the radio station has been around for the past 10 years or so and it doesn't have a DJ.
00:24:05.670 --> 00:24:18.233
It plays music, the occasional commercial, but it is mostly just this one guy with pre-recorded announcements and they're region-specific.
00:24:18.233 --> 00:24:26.285
He's got stuff that talks about Baltimore or New York or wherever you are, but they just play the hits.
00:24:26.285 --> 00:24:36.730
They play what they know people want to listen to, based on feedback from other areas, and that's all an algorithm and it's fun to listen to.
00:24:36.730 --> 00:24:46.144
It's not bad radio, it's just music, but there still is a kind of human behind the scenes running the thing, which I think is important.
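[Editor's note: the "paying attention to what I was listening to" behavior Benjamin describes a moment earlier can be sketched as simple co-occurrence counting. This is a toy example only, with entirely made-up listening data, and is not Spotify's actual system: it suggests artists that other listeners play alongside yours.]

```python
from collections import Counter

# Toy recommender: artists that co-occur with mine in other listeners'
# histories get suggested. All data here is invented for illustration.
histories = [
    ["Bach", "Chopin", "Satie"],
    ["Bach", "Satie", "Aphex Twin"],
    ["Aphex Twin", "Daft Punk"],
    ["Chopin", "Satie"],
]

def suggest(my_artists, histories, k=3):
    """Rank artists that co-occur with anything I already listen to."""
    mine = set(my_artists)
    scores = Counter()
    for h in histories:
        if mine & set(h):                        # this listener overlaps with me
            scores.update(a for a in h if a not in mine)
    return [artist for artist, _ in scores.most_common(k)]

print(suggest(["Bach"], histories))  # -> ['Satie', 'Chopin', 'Aphex Twin']
```

[Like the Jack FM example, the output only feels curated because human listening habits feed it.]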
00:24:46.700 --> 00:24:50.884
Yeah, interesting.
00:24:52.366 --> 00:25:03.624
I think that's, for me at least, one of the bigger parts of it: that human element, whether or not it's in academia.
00:25:03.784 --> 00:25:33.432
And you've got someone who's writing a paper. Again, Professor Holland (and I'll make sure she listens to this, or she knows she's been shouted out several times in this) will tell you that it's pretty obvious when someone has used one of these algorithms to write a paper.
00:25:33.432 --> 00:25:46.809
I think the other problem (this might be an interesting question for you) is that when these things have come before academic boards, technically that would be plagiarism.
00:25:46.809 --> 00:25:58.808
If I'm not writing it, and it's being written, or mostly written, or even mostly outlined, by something like ChatGPT, how do you address that?
00:25:58.808 --> 00:26:01.898
How do you address that from an academic perspective?
00:26:01.898 --> 00:26:19.009
Do you say, yep, everybody has the right to at least outline it this way and use ChatGPT to at least write the skeleton around what it is that you're going to put together as a paper?
00:26:19.009 --> 00:26:37.815
How do you think, Devon, at Shenandoah, and maybe at other universities, because you've probably spoken with other people in the same seat as you in these places: how do you differentiate?
00:26:37.815 --> 00:26:40.663
How do you either...
00:26:40.663 --> 00:26:42.067
I
00:26:42.166 --> 00:26:44.451
don't want to say punish, but maybe that's it.
00:26:44.451 --> 00:26:47.722
How do you handle these particular issues?
00:26:47.722 --> 00:26:48.144
Ben
00:26:48.144 --> 00:26:54.846
just put a student on the chopping block. Benjamin, sorry about that.
00:26:54.846 --> 00:26:58.614
How do you handle that kind of stuff?
00:26:58.614 --> 00:27:10.634
Some might argue that academia has lost its rigor, but how do you make sure that you're still being rigorous, that you're thinking?
00:27:10.634 --> 00:27:22.762
The one example that you gave about history is good, but what about me, the student who goes home and spits a couple of questions into ChatGPT and makes that my response for what it is that I'm doing?
00:27:25.440 --> 00:27:31.863
These are the questions that we've been having this summer and this past semester, that all schools have been battling with.
00:27:31.863 --> 00:27:33.086
How do we approach this?
00:27:33.086 --> 00:27:37.564
Because you've got some professors that are gung-ho.
00:27:37.564 --> 00:27:45.167
They want to use this in their class, they want to integrate it, they want their students to become proficient in how to use these.
00:27:45.167 --> 00:27:49.788
Then you have other professors that are 100 percent, completely against it.
00:27:49.788 --> 00:27:53.181
They don't want any of their students to do anything with ChatGPT.
00:27:53.181 --> 00:27:57.369
Balancing syllabi.
00:27:57.369 --> 00:28:00.517
What does the syllabus say for the student
00:28:00.656 --> 00:28:19.782
per course? Rather than having something at the institutional level that either says yes or no, I think we're leaving it up to the individual faculty member to make that determination for their students and their own individual course, as to how they want to approach this particular technology.
00:28:19.782 --> 00:28:24.175
Of course, we've got support folks that are there to help.
00:28:24.175 --> 00:28:33.933
If they want to think about ways to integrate it into that English class, then we've got some folks that are there to help walk them through ways that they could use it.
00:28:33.933 --> 00:28:39.750
If it's a creative writing course, I can think of ample ways that we could come up with:
00:28:39.750 --> 00:28:43.351
Okay, here's a list of 10 ideas for a particular story in this genre.
00:28:43.351 --> 00:28:46.803
Now, get rid of all of them, right, because that's what the AI came up with.
00:28:46.883 --> 00:28:53.910
Let's think outside of what the AI is coming up with, or coming up with screenplays or other things.
00:28:53.910 --> 00:29:01.332
There are ways to integrate it into the writing process that aren't "write me a five-paragraph essay on a topic."
00:29:01.332 --> 00:29:29.708
So trying to make faculty think of ways that it could be used to expand what their students are capable of, versus just replacing what their students are doing, I think is the challenge that we have now, because everybody, I think, just thinks of: all right, I'm going to type into ChatGPT, write me a five-paragraph essay on, you know, Shakespeare's life, and then students are just going to submit that.
00:29:29.708 --> 00:29:36.294
But if you're giving assignments like that, then that's probably not a very good assignment anyway, right?
00:29:36.294 --> 00:29:49.692
So it's kind of thinking through how we assess students. And I think it is causing us to rethink how we assess students in this age, because I think it's going to have to change a little bit.
00:29:53.445 --> 00:30:06.659
I think, honestly, academia is a little bit behind the ball, because this kind of lazy student isn't anything new. You tell me, people buy papers?
00:30:06.679 --> 00:30:07.140
Is that what you're
00:30:07.801 --> 00:30:07.961
doing?
00:30:07.961 --> 00:30:13.638
Yes, they buy papers already and change them and adapt them to their own writing.
00:30:13.638 --> 00:30:24.378
I mean, people will download a book review on To Kill a Mockingbird and spend 20 minutes putting it in their own words and submitting it as a paper and getting a great mark on it.
00:30:24.378 --> 00:30:28.070
So I mean, that's existed since I was a kid.
00:30:28.070 --> 00:30:37.945
There were papers that I looked up because I was stumped and didn't understand something and found a paper that was written by somebody else and it helped me understand it.
00:30:37.945 --> 00:30:44.938
I didn't pass it off as my own work, but I did look up stuff because I just didn't understand the material and I didn't have a very good teacher.
00:30:44.958 --> 00:30:49.425
But I don't think that.
00:30:49.425 --> 00:30:58.163
I think that, in large part, academia has been kind of ignoring that problem for a very long time.
00:30:58.163 --> 00:31:14.461
I'm glad that you're actually taking the bull by the horns, facing it head-on and dealing with it, and recognizing that the students you are teaching are going to be exposed to and using these tools, and that they might as well learn how to use them in effective ways.
00:31:14.461 --> 00:32:06.233
But yeah, again, I think that it's all about learning how to critically think, which is what I think Jeff was kind of alluding to: that's what the purpose of education is. And I think that, you know, even when I was a young student, the main reason why any of us would try to buy a paper online was because we were overworked and overwhelmed by classes, and we actually did want to get a good grade and were trying to figure out all of the tools that would help us do so. But, you know, it wasn't recognized or acknowledged by any of the instructors.
00:32:06.273 --> 00:32:12.513
I wish instructors had gotten up in front of the class and said: hey, listen, I know you can buy essays online.
00:32:12.513 --> 00:32:14.800
The point is not the essay.
00:32:14.800 --> 00:32:16.486
The point is to think about the material.
00:32:16.486 --> 00:32:32.953
That's what I'm trying to get you to do. And if writing an essay is not the most effective way to get students to think about the material, then maybe the essay is the problem, and there should be other ways to complete those tasks.
00:32:34.141 --> 00:32:35.738
Yeah, yeah. I'm sorry, go ahead, Devon.
00:32:36.201 --> 00:32:44.989
No, I think, you know, we've been talking about the flipped classroom for a long time, where, you know, students ingest content outside of the class.
00:32:44.989 --> 00:33:03.829
They do their readings, and then when they come into the class, it's like a Socratic seminar, where the professor's job is to really just foster a discussion about the particular topic, and I think it's those types of teaching modalities that are going to have to become more prevalent in this,
00:33:03.829 --> 00:33:18.064
You know, age of AI, because that's how you're going to be able to assess those critical thinking skills and what the students are really learning is by having those conversations, you know, one-on-one with the student or in a group context.
00:33:18.064 --> 00:33:28.673
So, you know, we've been working towards that type of classroom, right, the flipped classroom, for a while, but I think it's going to become even more important now.
00:33:29.599 --> 00:33:30.782
It's just harder for teachers.
00:33:30.782 --> 00:33:38.330
It's very difficult to assess a student based on class participation, especially when you have dozens of different personalities.
00:33:38.330 --> 00:34:00.743
It's much easier to grade a student based on a written essay. And so, you know, the teacher is often trying to do the easy thing for them by having people write essays, and the student is trying to do the easy thing for them by, you know, getting the essay from somewhere else.
00:34:00.743 --> 00:34:19.786
So it's really about, you know, time saving and trying to get through the thing rather than actually absorb the thing, which I think is a huge issue, and I think it's at the root of how AI is being misused.
00:34:21.034 --> 00:34:22.501
Yeah, so two thoughts about this.
00:34:22.501 --> 00:34:43.682
First of all, Devon, you said a few minutes ago that these AIs are able to pass things like the LSAT pretty easily, or other tests that we currently use to get people into various forms of education, whether it's law school or medical school.
00:34:43.682 --> 00:34:48.965
As someone who sucks at taking tests, always have...
00:34:48.965 --> 00:35:01.561
You know, I have always been challenged that way, although, you know, I think that I was smarter than I ever thought I was because I sucked at tests.
00:35:02.434 --> 00:35:15.246
Well, toot my own horn here, or trying to toot a horn that maybe doesn't need to be tooted, but I think this speaks to, and to your point, Devon:
00:35:15.246 --> 00:35:19.905
This speaks to the way that we view education.
00:35:19.905 --> 00:35:30.775
I know students will always come in to professors and I don't think this is as true in secondary education.
00:35:30.775 --> 00:35:42.880
Oh, sorry, in like high school education. But in college, people are expecting to get an A; the A is the thing, right?
00:35:42.880 --> 00:35:47.661
It doesn't matter how you get to the A, what the A is or what that is.
00:35:47.661 --> 00:35:59.023
And I will say again, talking up Professor Holland: I only had this professor in graduate school.
00:35:59.023 --> 00:36:14.061
In graduate school, all of her classes were wrangling with ideas, you know, and then learning to take the things that you wrangled with and put them down, you know, on paper. Cheaters gonna cheat.
00:36:14.061 --> 00:36:19.561
You know, there's no getting around that, in school or not.
00:36:19.561 --> 00:36:23.342
And I think, you know, the way that we think about education is typically:
00:36:23.434 --> 00:36:26.905
I'm gonna get a job, then I'm gonna go make a million dollars.
00:36:26.905 --> 00:36:39.903
You know, that's kind of the idea, rather than: I'm coming to this place to play with ideas and learn something I don't know and get out of my comfort,
00:36:39.903 --> 00:36:44.458
you know, whatever my comfort space is, to be able to hear things and learn things and think about things.
00:36:44.458 --> 00:36:50.065
So there's something to be said for that, I think.
00:36:50.065 --> 00:36:53.956
Anyway, you're not gonna get around the people that wanna cheat, you know.
00:36:53.956 --> 00:37:02.007
You're not gonna get around the people taking a shortcut to get to something good, like me using Diffusion Bee to create a piece of artwork.
00:37:02.355 --> 00:37:06.461
Granted, we needed a piece of artwork yesterday when we were, you know, doing this the first time.
00:37:06.461 --> 00:37:09.324
That was the whole idea behind it.
00:37:09.324 --> 00:37:21.246
But people who are really interested in thinking, I think, are always gonna put their noses down.
00:37:21.246 --> 00:37:23.684
They're gonna be the ones that wanna play with the ideas.
00:37:23.684 --> 00:37:35.179
They're not gonna use things like ChatGPT as a starting point.
00:37:35.179 --> 00:37:45.782
So maybe it offers an opportunity for us to rethink how we think about education and what it is and what its purpose is.
00:37:46.454 --> 00:37:49.255
But then again, the users of these tools... like...
00:37:49.255 --> 00:38:02.320
I think we've so far been talking about these tools as a very individual experience: the one student choosing to use, or not use, these tools in order to cut a corner.
00:38:02.320 --> 00:38:13.541
The bigger danger with these tools is when they are used by large companies and industries to replace people.
00:38:13.541 --> 00:38:42.860
So, for example, you know the radio show that you talked about: a lot of large companies, instead of hiring a team of people to be a creative department, are hiring one person and telling that one person to do all the things.
00:38:42.860 --> 00:38:45.661
That's been true for a while.
00:38:45.661 --> 00:39:12.338
I have had to become a master of many different things because of the industry that I'm working in, that being a creative: writer, illustrator, photographer, videographer, graphic designer, typographer. All of these things I am quite good at, but that's because the art departments at various companies have gotten smaller and smaller and smaller.
00:39:12.338 --> 00:39:45.280
So, essentially, the point that I'm trying to make here is that the biggest danger with these AI tools is where they are mixed with rampant capitalism, this society that we're living in, and used to squeeze people out and to replace people, and not only replace people but also pay them less, because you can use ChatGPT to write your blog post for free right now.
00:39:45.280 --> 00:39:59.822
Using a starter account means that you are less likely to hire a person to do that, who would provide that soul. And you experienced that with this podcast.
00:39:59.875 --> 00:40:17.443
You guys needed an album cover because it needed to be done, and so the quickest way to get that done was not to hire someone and have them invoice you and talk about your creative ideas and do this iterative process which ultimately gets you this thing that you're gonna love.
00:40:18.215 --> 00:41:08.768
It's easier to just dump that into an AI image generator and throw up whatever it spits out and have that as your album cover. And I think that's really the crux of the issue that I have: in a society where people were taken care of and you didn't have to scrape by in order to pay your bills, I think that AI would be a fantastic creative companion for being able to make anything that your heart could desire. But as it stands right now, it's being used as a tool to replace people who really love what they do, which is terrible.
00:41:08.768 --> 00:41:23.402
I wish AI was being used to actually solve some of the bigger issues that we have: not just, like, energy crises, but global warming, or even super complicated things like racism.
00:41:23.402 --> 00:41:40.121
I actually think that AI could be used in that direction, to work on those issues, and instead they're using it to create sketches for a storyboard and replacing the person who loves making storyboard sketches.
00:41:42.494 --> 00:41:49.740
Yeah, which is part of what's going on with SAG-AFTRA right now, and the writers, and, to some extent, the actors.
00:41:49.740 --> 00:42:02.721
You've got those things being seen as real issues, where you've got real human beings that can create something, and there's value in it.
00:42:02.721 --> 00:42:07.976
At least we'd like to believe that there's value in that creative process. I do.
00:42:07.976 --> 00:42:12.748
That's the heart of where I live and breathe.
00:42:12.748 --> 00:42:37.480
And yet, AI, for guys like Tom and me when we're working our day jobs (I'm mostly working a day job being able to set up a network and secure that network), whereas it would be good to have a human doing that, it's also good to have something in the background creating those kinds of things.
00:42:38.974 --> 00:42:48.822
So I do think there's maybe a flip side to that, too, where I mean these tools are democratizing content creation, right.
00:42:48.822 --> 00:43:05.942
I mean, they're making it so that me, who can barely draw a stick figure, right, could create something for my personal website, right? An image, or a starting paragraph, a bio about some particular thing.
00:43:05.942 --> 00:43:13.563
So I do think that it's enabling folks to have more creative expression than they were able to have before.
00:43:13.563 --> 00:43:16.943
It's making it easier for them to be expressive.
00:43:16.943 --> 00:43:22.146
So it is, and there's no doubt it's going to impact jobs.
00:43:22.146 --> 00:43:38.344
But I do think it is also going to lead to an increase in creative expression for those folks that maybe aren't artists or aren't good writers or aren't good creators of music, right? I think it's going to lead to an explosion in that as well.
00:43:39.576 --> 00:43:42.224
I wouldn't call that democratization of anything.
00:43:42.224 --> 00:43:45.143
I would call that the replacement of things.
00:43:45.143 --> 00:43:47.320
You're just choosing a different partner.
00:43:47.320 --> 00:43:58.579
Rather than partnering with an artist to help you create the thing that you want to create, you are partnering with a machine to create the thing you want to create.
00:43:58.579 --> 00:44:02.724
It's not like you are doing the creation at all.
00:44:02.724 --> 00:44:07.277
You are just partnering with a different source, and when you... But is that
00:44:07.318 --> 00:44:17.942
the same thing that they said when Photoshop came in, and the old-school film photographers said you're not a real photographer, because real photographers use film?
00:44:19.579 --> 00:44:30.920
No, I think that what that's getting at is more about the difference between having a tool and having a solution.
00:44:30.920 --> 00:44:49.835
A tool is something that you can learn to use and learn to work with, and I am actually pro using AI as a tool to do things that nobody wants to do, but...
00:44:50.675 --> 00:44:51.449
Take out the garbage.
00:44:51.875 --> 00:45:06.420
Yeah, take out the garbage. But in terms of having it replace people who really do want to do the work, or using it as an excuse to not pay people who are really good at what they do, that's terrible.
00:45:06.420 --> 00:45:15.943
I mean, Devon, I doubt you would disagree that if the institution that you're working for said, you know what, your teaching methods,
00:45:15.943 --> 00:45:20.123
they're good, but they're easily repeatable by an AI.
00:45:20.123 --> 00:45:29.284
We could do a computer generated version of you and just put out a bunch of videos and we understand that you've got tenure and everything.
00:45:29.284 --> 00:45:41.561
But we're good; we'll just use the teaching methods that you have and auto-generate them, and we're gonna send you out faster.
00:45:42.324 --> 00:45:42.483
Right.
00:45:42.864 --> 00:45:43.726
I wanna think a little bit about
00:45:43.726 --> 00:45:48.003
what Devon said there a couple of minutes ago, where he could barely draw a stick figure.
00:45:48.003 --> 00:45:49.219
I kind of fall into that too.
00:45:49.219 --> 00:45:56.860
And so, Benjamin, you mentioned there using AI to replace things that you don't want to do.
00:45:56.860 --> 00:46:01.003
For me, that would be some type of a graphic image.
00:46:01.003 --> 00:46:23.985
Now, I wouldn't expect it to be the quality of work that you do or that Randall's done for us, but in a pinch, where I need just a simple header image, maybe for a webpage, where it would take me 20 hours in Affinity Designer or something to put something together that just looks atrocious... I'm not arguing that you shouldn't do it.
00:46:24.195 --> 00:46:28.726
I'm just advocating that you choose your partners wisely, right?
00:46:28.726 --> 00:46:41.905
Could you talk to a designer who you really like and who you could have a closer connection with, or could you have a machine, which does not?
00:46:41.905 --> 00:46:43.541
There are other intrinsic values.
00:46:43.541 --> 00:46:48.842
So, for example, the person that you hired to create the album cover that you have now.
00:46:48.842 --> 00:46:51.302
You now have a closer connection with that person.
00:46:51.302 --> 00:46:56.043
That person now has more money that they're able to sustain themselves with.
00:46:56.043 --> 00:46:58.840
You're able to talk about it.
00:46:58.840 --> 00:47:02.418
You've talked about it nearly every episode since you put it up.
00:47:02.679 --> 00:47:03.101
Yeah, true.
00:47:03.695 --> 00:47:05.594
And so, like, it's providing all kinds of... We do like it.
00:47:05.594 --> 00:47:06.478
We have stickers.
00:47:06.478 --> 00:47:17.403
It's providing all kinds of value that you were not able to actually have with the previous AI generated album cover.
00:47:17.403 --> 00:47:17.824
You know.
00:47:18.474 --> 00:47:20.400
Yeah, I don't think we would have created stickers out of that.
00:47:20.400 --> 00:47:23.018
You know, like that.
00:47:23.018 --> 00:47:25.626
You know, to be frank, that was shitty.
00:47:25.626 --> 00:47:29.923
You know, it was enough to get us a picture.
00:47:31.076 --> 00:47:33.202
Sure, but the partnership is what I'm getting at.
00:47:33.202 --> 00:47:33.804
It was important.
00:47:33.943 --> 00:47:34.445
Yeah, no idea.
00:47:35.096 --> 00:47:42.599
The friendship that you have with this graphic designer, and that you're fostering that, and honestly talking about him
00:47:42.599 --> 00:47:45.510
every show is great for him too.
00:47:46.072 --> 00:47:56.623
Yeah, and that's the intent of that as well: to hopefully, you know, draw some people to Randall, to get them to use him for their artwork.
00:47:56.623 --> 00:48:16.594
I think one of the things that I correlate with what you're saying, Benjamin, is, as at least two of you know (I don't know if you know this, Devon), I do a lot of theater. I was, up until Tuesday, the president of the local community theater company that I'm a part of.
00:48:16.594 --> 00:48:20.494
I'm happy to say I'm retired, at least briefly, from the board for a year.
00:48:20.494 --> 00:48:25.271
Thank the little baby Jesus for that.
00:48:25.271 --> 00:48:37.704
But my favorite part of theater is the collaboration, the actual creation of whatever it is that we're working on.
00:48:37.704 --> 00:48:41.295
Do I love to be in a show up on stage and for an audience?
00:48:41.295 --> 00:49:00.719
Sure, but all those steps in between, from the sound design and lighting design to, you know, the rehearsal process, to, you know, costuming, all those little pieces... which, you know, theater still may be one of the last outposts. Live music, maybe,
00:49:00.739 --> 00:49:01.402
the same thing.
00:49:01.402 --> 00:49:02.163
You know, where you...
00:49:02.163 --> 00:49:19.280
You have a completely interactive process that involves a bunch of people, each thinking creatively to create one unique experience, and then you add an audience to that, and every single show is a unique experience, you know, depending on how the audience responds, how the actors are responding on stage.
00:49:19.280 --> 00:49:32.538
I feel that, Benjamin, I really do. And I will say you're right with regard to Randall's creating of our show artwork.
00:49:32.538 --> 00:49:37.331
That's an important piece of this puzzle.
00:49:37.331 --> 00:49:54.896
We are, I think (Tom, I won't speak for you), really proud of this, you know. And proud of, you know... sorry, Tom, your droopy eyes and my balding head. It's why we're not models, Jeff.
00:49:57.221 --> 00:50:06.021
This isn't the first time that we, as humanity, have cut corners in terms of labor costs.
00:50:06.021 --> 00:50:12.342
We have a very storied history of doing that and it does not usually turn out well.
00:50:12.342 --> 00:50:18.572
We, you know, captured Africans and dragged them over to the Americas and had them pick crops because we didn't want to do it.
00:50:18.572 --> 00:50:22.681
So, you know, there are countless...
00:50:22.681 --> 00:50:31.672
The whole reason why we have the capitalist system that we do is because people want to cut corners and would rather sit back and collect money than go out and make money.
00:50:31.672 --> 00:50:49.065
So, I mean, it's not hard to draw parallels and see that, you know, us being lazy and not paying people properly for really good work is not great for society. And so, you know, it's...
00:50:49.065 --> 00:51:05.876
It's not terribly difficult to see that the money we're spending, which is going into the pockets of the developers of ChatGPT and the AI-generated artwork pieces, is not going into the pockets of the people
00:51:05.876 --> 00:51:09.653
whose work they source the material from, which is kind of a problem.
00:51:10.114 --> 00:51:14.331
So, I mean, yeah, sourcing material is a key feature there.
00:51:14.331 --> 00:51:15.954
And Tom, yeah, politics, baby.
00:51:15.954 --> 00:51:20.822
All right, we want to...
00:51:20.822 --> 00:51:22.050
We want to get close
00:51:22.050 --> 00:51:31.139
to wrapping this up. I know Tom's favorite subject is not talking about politics, but I'm down with you; I feel what you say.
00:51:31.922 --> 00:51:42.414
We try to sit here on $2,000 MacBooks, by the way. Yes, I mean, it's a risky thing to bring up. We're...
00:51:42.596 --> 00:51:46.530
We're all, you know, all about Mac products, and we review them and use them.
00:51:46.530 --> 00:51:48.014
I am one of those users.
00:51:48.014 --> 00:52:06.824
I am a self-defined techie, but I do think that it's important to also think about the ethics of what we're doing and also how we're using these tools and whether or not, you know, we are furthering society towards moral degradation.
00:52:07.139 --> 00:52:25.737
Yeah, by the way, these products are built, you know, in other countries with, yes, much, much lower prices for labor. Battery components are hard to get. Yeah, they are. Dude, they are mined up here in the subarctic, where I currently am.
00:52:25.757 --> 00:52:28.161
So yeah.
00:52:29.130 --> 00:52:32.126
There's good in lots of things, there's bad in lots of things.
00:52:32.126 --> 00:52:33.693
But you've got to pay attention to both.
00:52:33.693 --> 00:52:34.775
I mean, you do.
00:52:34.775 --> 00:52:36.739
Yeah, it's not all good, it's not all bad.
00:52:36.739 --> 00:52:39.923
I mean, when we look at the march of this stuff...
00:52:39.923 --> 00:52:49.355
You know, we talked about the radio station you know 20, 30 minutes ago there, and I've got some history with that going back to my dumb teenage years.
00:52:49.355 --> 00:52:57.684
But even back then, the trend was, you know, syndication was a thing, right?
00:52:57.724 --> 00:53:01.461
Casey Kasem's American Top 40, right, and so there was a local person.
00:53:01.483 --> 00:53:16.387
You're old, I just want to point that out. Saying Casey Kasem, no shit. Come on. Yeah, but, um, you know, and with that, and so if you're playing that... Here's one, maybe you guys will remember: Dr. Demento.
00:53:16.387 --> 00:53:17.311
Oh yeah, of course.
00:53:17.612 --> 00:53:23.512
Yeah, yeah, see, Jeff's old too. Shut up. But I mean, you're absolutely right.
00:53:23.512 --> 00:53:27.010
Casey Kasem and the syndication put a lot of radio DJs out of business.
00:53:27.010 --> 00:53:27.990
Yeah, a lot of it.
00:53:28.079 --> 00:53:38.418
Especially, they used to do the top 10, and they were replaced by him. Yeah, right. And as soon as you could... so as soon as they could deliver that digitally...
00:53:38.418 --> 00:53:45.019
It was even more of an impact, because at a time they would send it to you, like King Biscuit Flower Hour and all those things.
00:53:45.019 --> 00:53:46.291
They would send it to you, which was weird.
00:53:46.291 --> 00:53:49.052
I don't know how they pressed so many albums, and I still have some in my garage.
00:53:49.052 --> 00:53:53.115
Yeah, I want to check those out. But they would send you the album and you would play it.
00:53:53.115 --> 00:53:56.273
So there had to be somebody there for that, and then you'd play
00:53:56.313 --> 00:53:57.958
your commercials when they had their breaks.
00:53:57.958 --> 00:54:05.960
Then you'd start the record again and it would play the next segment. But then, for sure with satellite initially,
00:54:05.960 --> 00:54:14.519
once that kicked in, it took out the need for the person to sit there and do it, because then it was digital and you could have a system play the commercials when you needed them.
00:54:14.519 --> 00:54:16.974
So there's more automation, taking people out of the picture.
00:54:16.974 --> 00:54:32.956
Radio got deregulated, so a lot of the mom-and-pops were picked up by Citadel and Clear Channel and all those, and then you would have one guy in Boston who would do the voice tracks, kind of like what you're talking about, Benjamin.
00:54:33.197 --> 00:54:45.753
Jack, right? Yep, he'd do the voice tracks and send them out to Albuquerque, Santa Fe, Spokane, wherever, and then they would just dump those into the digital system that they have, right?
00:54:45.753 --> 00:54:49.992
So this stuff's been going on, and you know, you guys have probably seen the meme.
00:54:49.992 --> 00:54:53.059
Like for Excel, they have this picture that
00:54:53.059 --> 00:54:55.193
looks like it's from the '40s or '50s.
00:54:55.193 --> 00:55:18.364
For all I know it's an AI-generated picture, who knows, but it's a bunch of people working in, like, the accounting department, all heads down on the ledgers, and the caption is something like, Excel replaced this whole wing of the building, right? But the question is whether or not those people wanted those jobs.
00:55:18.726 --> 00:55:19.590
You know, right?
00:55:19.590 --> 00:55:20.371
Or was it hell?
00:55:20.371 --> 00:55:25.815
Which is hard to answer, right. But I don't think it is very hard to answer for creatives.
00:55:25.815 --> 00:55:28.873
I think creatives typically love the work that they're doing.
00:55:28.873 --> 00:55:29.856
That's why they get into it.
00:55:29.856 --> 00:55:30.922
It's a risky field.
00:55:30.922 --> 00:55:43.594
Being a creative is very, very difficult; sustaining yourself is much harder than it is for, you know, a doctor or lawyer or anything else you might try being, although that's changing.
00:55:44.056 --> 00:55:50.097
You know, those fields are already risky, and it's almost unanimous.
00:55:50.097 --> 00:55:52.288
All the creatives are saying, no, we love doing this.
00:55:52.288 --> 00:55:53.414
Don't take this away from us.
00:55:53.414 --> 00:55:54.197
Well,
00:55:54.257 --> 00:55:56.168
are there any pieces of that work you don't like?
00:55:56.168 --> 00:55:56.228
Oh,
00:55:56.228 --> 00:56:01.668
sure. Then you might consider using it to, like, say, I hate this part.
00:56:01.708 --> 00:56:31.422
Let it do it. Absolutely, and there are ways that I use those tools to do those things. So, for example, I used to be a pretty avid wedding photographer, and I would use Adobe Lightroom to edit the first photo, then take all the settings from that edit and apply them to all of the rest of the photos, all of them.
00:56:31.422 --> 00:56:39.606
There would be 50 photos, all taken in the exact same lighting in the exact same place, and I'd be able to edit them all at once.
00:56:39.606 --> 00:57:03.235
And you know, that's a form of AI, it's an algorithm, but I don't know a whole lot of people who want to sit there and edit photos one by one, adjusting all of the individual sliders until it's absolutely perfect, when it's just the same as the previous photo.
00:57:03.235 --> 00:57:08.867
That's my point: don't take away stuff that people actually want to do.
00:57:09.789 --> 00:57:32.606
You know, out of that hundred or so accountants sitting in that room, if one or two of them really loved what they were doing and the rest of them were like, no, this is hell, give those one or two the Excel document and let them run it from there. And that's what we have done.
00:57:32.606 --> 00:57:40.452
You know, the people in accounting departments now are people who love Excel and love what it does, and they're total nerds about it.
00:57:40.452 --> 00:57:41.615
And more power to them.
00:57:41.615 --> 00:57:52.235
I'm not one of those people, but you know it's a way to let people continue to do what they're passionate about and also pay them properly.
00:57:52.235 --> 00:58:00.081
You know, that's an important part of this: not cutting corners just to save a buck. Yeah.
00:58:00.240 --> 00:58:09.559
And, Jeff, I think I brought this up when we talked a little bit about this with Kirk McElhearn on the classical show.
00:58:10.041 --> 00:58:10.342
Yeah.
00:58:10.664 --> 00:58:28.324
Yeah, cal Newport's done some pretty good thinking on this stuff and he's written some books and he's a computer science professor at Georgetown And I think it was on the focused podcast that I heard him talk about this Like what's it all going to mean for jobs?
00:58:28.324 --> 00:58:32.853
Right. And his take, at least
00:58:32.853 --> 00:58:36.559
then, and it's paraphrased, so there's some nuance around it I'm probably missing,
00:58:36.559 --> 00:58:48.074
But he said that he does expect there will definitely be some contraction, which I think the four of us probably kind of agree with.
00:58:48.074 --> 00:58:53.003
But he also thought that's an immediate, short-term thing.
00:58:53.003 --> 00:59:03.614
But then, as we kind of dip out of that, it's going to bring with it some type of re-creation that will bounce things back.
00:59:06.103 --> 00:59:07.733
I would encourage you to check out this stuff.
00:59:08.297 --> 00:59:08.478
Yeah.
00:59:08.478 --> 00:59:11.559
Just to get the details, because I didn't do it justice.
00:59:11.559 --> 00:59:29.728
But it's just like we were talking about, though, with that room full of ledger workers and Excel. I think that's a good example, because, you know, Devon, I know we use it a lot at the office, more Google Sheets for collaborative stuff, I think, but same principles.
00:59:30.809 --> 00:59:48.094
That has brought a lot of that to just, you know, ordinary people, because you don't really need a hardcore accounting background to put some stuff into a sheet and run some formulas, then look up on YouTube how to use the formulas and drop them in.
00:59:48.094 --> 01:00:04.177
So I like to be cautiously optimistic, but at the same time I think that needs to be balanced with a healthy dose of skepticism, because there are people who will take advantage of other people.
01:00:05.057 --> 01:00:05.298
Always.
01:00:05.298 --> 01:00:06.559
Yeah, I think that's it.
01:00:06.559 --> 01:00:08.324
That's nothing new, it's just,
01:00:08.403 --> 01:00:10.288
You know, it's just the way the world goes.
01:00:11.811 --> 01:00:12.172
Cool, cool.
01:00:12.172 --> 01:00:15.802
All right, we've burned another hour of your time, we're happy to say.
01:00:15.802 --> 01:00:19.050
But, Benjamin, I think you gave us a good last word.
01:00:19.050 --> 01:00:23.106
Devon, we want to get a last word from you. Last thoughts?
01:00:23.106 --> 01:00:25.751
No pressure. I saw your eyes go wide, okay.
01:00:28.405 --> 01:00:29.146
It's not just your Mac,
01:00:29.146 --> 01:00:33.746
as I hold up my iPhone, my AirPods, my Apple Watch, right?
01:00:36.621 --> 01:00:38.114
I think that's the other side of this too, right?
01:00:38.114 --> 01:00:44.472
I think that it should be mentioned that one of the reasons why we buy these machines is because they last a long time.
01:00:44.472 --> 01:01:13.373
You know, the Apple devices that we purchase are incredibly long-lasting. I don't know anybody who has a 10-year-old Dell, but I do have a 10-year-old Mac sitting in my basement which runs my house and does all of the home automation stuff, you know. And so I think that's also an important part of being an Apple user: we don't want to throw things away every two years.
01:01:13.393 --> 01:01:15.804
So... Except for your phone.
01:01:16.699 --> 01:01:21.746
What, a new phone every two years? Not me, I've still got an older phone.
01:01:21.746 --> 01:01:23.965
I only upgrade when the camera does.
01:01:23.965 --> 01:01:25.088
Yeah.
01:01:25.920 --> 01:01:28.146
All right, Devon, final thoughts.
01:01:30.222 --> 01:01:31.766
Now, this was a great conversation.
01:01:31.766 --> 01:01:36.539
I'm really looking forward to what this brings.
01:01:36.539 --> 01:01:42.559
It's going to be a really interesting year, I think, with the 2024 elections coming up.
01:01:42.559 --> 01:01:54.559
We're going to see some really interesting, controversial use cases of this technology, whether that be deepfakes; we're already seeing imagery and videos from some campaigns.
01:01:55.302 --> 01:02:15.559
And I think, you know, we are just starting this conversation. We live in exciting times, and this is certainly going to be a tumultuous technology. So, yeah, I'm somewhat excited, a little hesitant, right, a little scared, a little nervous, all the emotions about this thing.
01:02:15.559 --> 01:02:19.088
But, yeah, great talking with you guys today about it.
01:02:19.588 --> 01:02:20.530
Likewise. Thank you, Devon.
01:02:20.530 --> 01:02:21.900
Thank you, Benjamin. Devon,
01:02:21.900 --> 01:02:25.414
if people want to find you on the internet, or send you a message someplace,
01:02:25.414 --> 01:02:26.539
where can they find you?
01:02:27.041 --> 01:02:33.559
Yeah, I am Devon Taylor WV on Twitter and LinkedIn; those are my two main social media platforms of choice.
01:02:34.489 --> 01:02:35.559
Awesome. And Benjamin?
01:02:36.523 --> 01:02:41.460
So, like I said, I run a small company called Zerflin, Z-E-R-F-L-I-N.
01:02:41.460 --> 01:02:44.559
You will find it almost everywhere.
01:02:44.559 --> 01:02:46.338
You can also find me.
01:02:46.338 --> 01:02:50.559
My last name is pronounced Young Savage, but it's not spelled that way.
01:02:50.559 --> 01:02:58.954
But if you search for Benjamin Young Savage, you will ultimately find me and find out that I am on every social media network known to man.
01:03:01.045 --> 01:03:01.525
Excellent.
01:03:01.525 --> 01:03:05.929
And, Tom, where can we find you?
01:03:06.920 --> 01:03:07.704
I'm going to go to bed.
01:03:09.342 --> 01:03:12.005
Great. You want people to find you there, Tom?
01:03:13.905 --> 01:03:14.668
No, leave me alone.
01:03:14.668 --> 01:03:18.047
I'm on anti-social media.
01:03:18.699 --> 01:03:23.324
Yeah, Tom F Anderson on Twitter, TomFAnderson.com on the web. And guys,
01:03:23.324 --> 01:03:23.947
Thanks for coming on.
01:03:23.947 --> 01:03:24.590
We appreciate it.
01:03:24.590 --> 01:03:25.876
It's been a good discussion.
01:03:25.876 --> 01:03:33.367
Truly. And if possible, maybe in another six to 12 months, as this thing kind of keeps chugging along,
01:03:33.367 --> 01:03:40.376
we can get you back on and reevaluate, and see whether it's become a total shit show or balanced out a little bit, or what.
01:03:41.021 --> 01:03:43.708
Maybe after the election, so we can talk about your favorite topic, Tom.
01:03:46.545 --> 01:03:47.929
I think I got some vacation coming up.
01:03:47.929 --> 01:03:50.498
It's going to be my show.
01:03:50.639 --> 01:03:56.184
I'll be in my room. With a blanket over his head,
01:03:56.184 --> 01:03:57.047
his eyes closed.
01:03:58.579 --> 01:03:59.061
Rocking, baby.
01:03:59.061 --> 01:04:01.942
Jesus, call me home. No.
01:04:02.523 --> 01:04:03.748
I had a blast coming on here.
01:04:03.748 --> 01:04:05.456
I would love to be back on again.
01:04:06.018 --> 01:04:06.559
Really appreciate it.
01:04:07.262 --> 01:04:07.744
I'd love to have you.
01:04:08.880 --> 01:04:09.523
I appreciate that.
01:04:09.523 --> 01:04:11.784
Thank you. Be sure to tell all your friends about it.
01:04:11.784 --> 01:04:17.519
As usual, you can get to us at feedback@basicafshow.com.
01:04:17.519 --> 01:04:19.547
We do have stickers, we do have magnets.
01:04:19.547 --> 01:04:20.925
Have we given them all away, tom?
01:04:21.365 --> 01:04:21.445
No.
01:04:23.161 --> 01:04:26.771
Come on, people, you know we're giving away free stuff. Take it.
01:04:27.559 --> 01:04:33.492
Also, Jeff, we've maintained a streak of one rating in Apple Podcasts per show.
01:04:33.492 --> 01:04:38.070
So if somebody wants to give us a rating, that'll keep the streak going.
01:04:38.882 --> 01:04:40.128
So we've done 11 shows.
01:04:41.061 --> 01:04:42.704
We don't care, we just want you to notice us.
01:04:42.704 --> 01:04:47.005
We're all about the noticing.
01:04:47.005 --> 01:04:51.331
If you call us a bunch of dirtbags, we don't care, because we already know that that's true.
01:04:51.331 --> 01:04:57.612
The website is basicafshow.com.
01:04:57.612 --> 01:05:01.559
You can find me at Reyes Point pretty much everywhere.
01:05:01.559 --> 01:05:05.653
I am on LinkedIn these days, looking for work.
01:05:05.653 --> 01:05:09.052
If anybody wants to pass any along: I'm working, but happy to have more.
01:05:09.052 --> 01:05:09.835
What
01:05:09.856 --> 01:05:13.407
do you do again? You've got to let us know so we can replace you with AI.
01:05:13.969 --> 01:05:23.393
Oh good, yeah. That's something that's most likely going to be replaced with AI. I do IT consulting.
01:05:23.393 --> 01:05:30.126
So, a wide variety of things: project management, you name it. Basically anything from project management down.
01:05:30.126 --> 01:05:34.347
I'll be your director of IT if you want me to; I'd be good at that as well.
01:05:34.347 --> 01:05:42.911
Theme music, as always, by Psychokinetics, and you'll find links to Psychokinetics in the show notes.
01:05:42.911 --> 01:05:49.559
As I said before, Celsius 7, one of the frontmen for Psychokinetics, has a new album out, which we'll have a link to as well.
01:05:50.081 --> 01:05:57.559
Podcast artwork by Randall Martin Design, and, yes, we are very pleased to have real artwork by a real human being.
01:05:57.559 --> 01:06:09.474
You should check out Randall's work, and we will also have links to Benjamin's work, the Zerflin website and also his personal links in the show notes as well.
01:06:09.474 --> 01:06:21.110
This particular set of show notes is going to have a zillion AI-related articles that we think are valuable, way more than we could have gotten into in any great detail, although I think we covered it well here.
01:06:21.110 --> 01:06:26.606
So thanks for listening, and see ya.
01:06:26.606 --> 01:06:28.320
Right, Tom? Right.
Photo / Graphic / Artist
I’m a photographer, graphic artist, illustrator, and independent abolitionist.
I grew up in the subarctic wilderness of Northern Quebec.
I make artwork.
I founded a design company called Zerflin. We champion underdog clients and believe that running a company without being evil is paramount. We’re reputable, honest and complete geeks about design and making things look good.
In my spare time, I love riding motorcycles, exploring abandoned ruins, and dismantling white supremacy.
I’ll be your digital maverick, hired gun, and best friend.
CIO @ Shenandoah University
Devon is an avid technology enthusiast with a passion for exploring software, apps, and the broader technological landscape. He currently serves as the Associate Vice President and Chief Information Officer at Shenandoah University. With a vibrant career that spans across multiple domains of technology and education, he is also an adjunct professor, teaching technology within the Business School and School of Education. His work often focuses on the intersection of technology and education, including the impactful rise of artificial intelligence in both sectors.