WEBVTT 00:00.031 --> 00:11.903 [JM]: So earlier today, I'm getting some things done, and suddenly I see the lights flicker in my office, and a continuous screeching tone comes from underneath my desk. 00:11.923 --> 00:13.447 [JM]: Oh, and my screen goes off. 00:13.467 --> 00:16.335 [DJ]: So you were probably worried there was a raccoon down there. 00:16.822 --> 00:21.030 [JM]: Never seen a raccoon around these parts, so I feel pretty confident that's not what happened. 00:21.491 --> 00:33.955 [JM]: But I was trying to figure out what did happen, and I think I quickly realized that there was just an infinitesimally small power blip, where it just was enough to make the lights flicker. 00:34.456 --> 00:36.720 [JM]: Lights were still on, but it was just enough... 00:36.700 --> 00:42.932 [JM]: to trigger my uninterruptible power supply, or UPS, underneath my desk. 00:42.952 --> 00:56.076 [JM]: And the reason it was screeching at me, I figured out is that the connected devices were drawing too much power, which is why suddenly my screen was not lit because the Mac Studio connected to the screen was not on. 00:56.056 --> 01:00.603 [JM]: So the chain of events was power goes out for probably less than a second. 01:01.083 --> 01:04.949 [JM]: UPS can't handle the load of the devices connected to it. 01:04.969 --> 01:08.314 [JM]: So power just gets cut off to all of those devices. 01:08.334 --> 01:15.685 [JM]: So not only is my Mac Studio off, but now the router is off and the fiber optic modem and anything else that was connected to it. 01:15.665 --> 01:19.029 [JM]: And it got me thinking like, wait a minute, this didn't happen before. 01:19.049 --> 01:20.811 [JM]: Like, why is it a problem now? 01:21.452 --> 01:25.817 [JM]: And I'm also asking, okay, well, what devices are connected to this thing? 01:25.837 --> 01:27.919 [JM]: Because whatever they are, they worked fine before. 
01:28.400 --> 01:32.765 [JM]: And I'm looking at it, I'm like, okay, well, my Apple Studio Display is connected to it. 01:32.905 --> 01:36.429 [JM]: That's got to be a pretty significant draw on power. 01:36.890 --> 01:38.331 [JM]: But again, it worked fine in the past. 01:38.351 --> 01:38.972 [JM]: There's the 01:38.952 --> 01:41.555 [JM]: fiber optic modem, there's the router. 01:42.256 --> 01:49.423 [JM]: Oh, right, the computer, because before I had a MacBook Pro connected to this whole setup, which doesn't draw as much power. 01:49.944 --> 01:51.626 [JM]: Oh, and also has its own battery. 01:52.026 --> 02:01.737 [JM]: So I have no idea how much power this beast of an M3 Studio draws with its 28 cores and 60 GPU cores and 02:01.717 --> 02:18.960 [JM]: 96 gigabytes of RAM. I'm sure it draws more power, I'm guessing, than this thing can handle, even if I were to unplug all of the other devices, which at this point I have pared down to just the fiber optic modem, the router, and the Mac Studio. 02:18.980 --> 02:26.830 [JM]: So my guess is, even if I just connected the Mac Studio and pulled the power to the UPS out of the wall, I'd probably get the same thing. 02:26.810 --> 02:30.999 [JM]: Loss of power to the Mac Studio, screeching noise from the UPS. 02:31.019 --> 02:34.968 [JM]: And so I think the conclusion here is I need to get a new UPS. 02:35.529 --> 02:44.329 [DJ]: I suppose we can't expect, even though the name is uninterruptible power supply, we can't expect the UPS to operate 02:44.309 --> 02:47.857 [DJ]: when there's more draw than it was designed to withstand. 02:47.877 --> 03:03.212 [DJ]: But the screeching noise it makes, you know, is just so unpleasant that I almost wish there was a complementary product called an unflappable power supply, where it just wouldn't lose its composure in the same way when these demands are put on it. 03:03.192 --> 03:05.335 [JM]: Yeah, and I also wish that the software were better.
03:05.836 --> 03:11.723 [JM]: Like, I am just frustrated by the inability for me to just see, right? 03:11.763 --> 03:17.311 [JM]: Like, there should be some system settings panel that I can bring up that says, okay, this is your current draw. 03:17.792 --> 03:20.315 [JM]: And this is how much your UPS is rated to support. 03:20.695 --> 03:21.777 [JM]: So guess what? 03:21.817 --> 03:23.059 [JM]: You don't need to test it. 03:23.079 --> 03:25.322 [JM]: You don't need to pull the power out of the wall. 03:25.342 --> 03:27.464 [JM]: We can tell you right now, you don't have enough. 03:27.505 --> 03:29.247 [JM]: And this is how much you would need. 03:29.387 --> 03:31.630 [JM]: Like, that would be a handy thing to be able to determine 03:31.610 --> 03:34.276 [JM]: without having to figure out how to determine it. 03:34.476 --> 03:36.260 [JM]: Like, I just wish that were built in. 03:36.280 --> 03:42.513 [JM]: I imagine there's probably some UPS maker that has software that would answer this question. 03:42.573 --> 03:47.965 [JM]: But the thing that I got off of Amazon, because where I live at this moment, 03:47.945 --> 03:52.972 [JM]: the brand that I would normally use, the brand that I used when I was in the States, was called CyberPower. 03:53.753 --> 03:58.619 [JM]: And I feel like that was a higher-quality product than the Epyc. 03:58.699 --> 04:03.226 [JM]: I think it's E-P-Y-C, Epyc Ion, something like that. 04:03.266 --> 04:06.310 [JM]: Some brand I'd never heard of, but I didn't have great choices here. 04:06.370 --> 04:09.013 [JM]: So I chose what I could find. 04:09.494 --> 04:13.322 [DJ]: Look, I know it's spelled wrong, but I didn't have any choice. 04:13.783 --> 04:21.318 [JM]: Yeah, I need to crowdsource and ask folks like, hey, all right, in this neck of the world, what are the best available UPS devices? 04:21.398 --> 04:28.212 [JM]: And just replace this underpowered and fairly bare-bones thing under my desk.
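[As an aside for anyone sizing a replacement: the load-versus-rating check JM wishes were built in can be sketched in a few lines of Python. Every wattage figure, the VA rating, and the power factor below are hypothetical estimates for illustration, not measurements of the actual hardware discussed here.]

```python
# Sketch of the "is my UPS big enough?" check discussed above.
# All numbers are hypothetical estimates, not measured values.

DEVICES_W = {
    "Mac Studio (peak)": 370,     # assumed worst-case draw
    "Apple Studio Display": 100,  # assumed
    "fiber optic modem": 15,      # assumed
    "router": 20,                 # assumed
}

UPS_RATING_VA = 800  # the rating printed on the unit
POWER_FACTOR = 0.6   # assumed; usable watts = VA * power factor

def ups_headroom(devices_w, rating_va, power_factor):
    """Return total draw in watts, usable capacity in watts, and a verdict."""
    total_w = sum(devices_w.values())
    usable_w = rating_va * power_factor
    verdict = "OK" if total_w <= usable_w else "overloaded"
    return total_w, usable_w, verdict

total_w, usable_w, verdict = ups_headroom(DEVICES_W, UPS_RATING_VA, POWER_FACTOR)
print(f"load {total_w} W vs usable {usable_w:.0f} W -> {verdict}")
```

[With these made-up numbers the sketch reports an overload, which matches the screeching-UPS symptom; with a lighter load it would report OK.]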
04:28.192 --> 04:45.994 [DJ]: As a side note, I have two CyberPower UPSs, which I purchased when I first started working from home on a regular basis, because I think the first time I had a power outage, I realized, like, oh, I just can't do my job anymore. 04:46.555 --> 04:48.157 [DJ]: This is not really going to fly. 04:48.577 --> 04:56.207 [DJ]: I think I might have even gotten a tax credit for them as business expenses, because it was like, no, I need these to function, essentially. 04:56.440 --> 05:00.244 [JM]: Do you know how many volt-amperes your UPS supports? 05:00.605 --> 05:02.307 [DJ]: 750, of course. 05:02.327 --> 05:03.608 [DJ]: Are you saying you don't know that? 05:03.928 --> 05:10.175 [DJ]: Well, I can see from where I'm sitting what it says on my UPS, so that's how much. 05:10.215 --> 05:14.240 [DJ]: But to your point, I don't know how much of that I'm actually using. 05:14.260 --> 05:14.500 [JM]: Right. 05:14.840 --> 05:19.225 [JM]: Well, I know I'm using more than 800, which is what this thing is currently rated for. 05:20.587 --> 05:20.727 [DJ]: Mm-hmm. 05:21.027 --> 05:28.622 [JM]: Yeah, it's just funny being in a world where suddenly when power goes away, even for a moment, whatever I'm doing disappears, right? 05:28.763 --> 05:41.348 [JM]: It's like I'm so used to using a MacBook Pro or something with an integrated battery that this is the first time in a long time I've been in a situation where if power disappears for even a blip, bye-bye. 05:41.328 --> 05:42.129 [DJ]: Yeah, you're right. 05:42.149 --> 05:46.254 [DJ]: That is an unusual situation in this day and age. 05:46.775 --> 05:59.631 [DJ]: Actually, on the desk behind me right now, I have an old, old iMac G4 that I got off eBay a couple years ago, mostly just because I've always loved the way that weird desk lamp-like computer looks. 05:59.771 --> 06:01.834 [DJ]: I got it up and running again recently. 06:01.814 --> 06:04.139 [DJ]: And among other things, it doesn't have a battery.
06:04.620 --> 06:07.907 [DJ]: So if you interrupt the power, it just shuts off. 06:07.927 --> 06:16.985 [DJ]: Another thing, by the same token of the way computers used to be, is that I've got classic Mac OS 9 running on it. 06:17.006 --> 06:20.633 [DJ]: And what I totally forgot about is those old operating systems 06:20.613 --> 06:33.390 [DJ]: did not have – I think preemptive multitasking is the main feature, or process isolation – so that when you're running a single application and it crashes, your whole computer goes down. 06:33.710 --> 06:37.195 [DJ]: There's no like, oh, we'll just force quit that and keep going. 06:37.215 --> 06:38.496 [DJ]: Like, no, no, no, no, no. 06:38.596 --> 06:40.339 [DJ]: You're restarting at this point. 06:40.559 --> 06:42.221 [DJ]: Everything is gone. 06:42.241 --> 06:43.943 [DJ]: So that was a rude awakening. 06:44.124 --> 06:48.569 [DJ]: But it did throw me back in time because, like, yeah, computers used to work that way. 06:49.030 --> 06:51.494 [DJ]: If anything went wrong, everything went wrong. 06:51.734 --> 06:52.936 [JM]: Those were the good old days. 06:52.956 --> 06:58.544 [JM]: And by good, I mean maybe not so good, but still fun to remember. 06:59.064 --> 07:02.569 [DJ]: Somehow I played TIE Fighter for 250 hours anyway. 07:02.589 --> 07:05.934 [DJ]: So, you know, I had persistent resilience. 07:06.415 --> 07:10.060 [DJ]: You had to, to be a computer user in the 1990s. 07:10.158 --> 07:11.440 [JM]: Okay, moving on. 07:11.460 --> 07:14.486 [JM]: I was publishing an episode today. 07:14.506 --> 07:16.690 [JM]: And when doing that, I published it. 07:17.231 --> 07:26.108 [JM]: And in that episode, we talked about how managers at some companies are mandating use of LLMs among employees in order to 07:26.088 --> 07:32.781 [JM]: I don't know, I guess, improve productivity; whatever the reasoning is, it is now something that some employees don't have any choice about anymore.
07:33.182 --> 07:35.567 [JM]: And I published this episode in which we talked about this. 07:35.587 --> 07:39.554 [JM]: And mere moments later, I came across a post on the Fediverse. 07:40.116 --> 07:45.987 [JM]: And I'm going to read part of this just because it was so very timely, given that we just talked about it. 07:45.967 --> 07:51.273 [JM]: This person says, I am now being required by my day job to use an AI assistant to write code. 07:51.613 --> 07:59.161 [JM]: I have also been informed that my usage of AI assistants will be monitored and decisions about my career will be based on those metrics. 07:59.181 --> 08:04.247 [JM]: Now, right there, like, we're already talking about this same thing that we talked about before, right? 08:04.307 --> 08:09.072 [JM]: Where it's crazy to me that this is just a 08:09.052 --> 08:11.857 [JM]: non-negotiable, this is what you're doing. 08:11.877 --> 08:20.634 [JM]: And if you aren't using these tools, then it's going to potentially affect your performance reviews and your continued employment. 08:20.654 --> 08:25.423 [JM]: So right there, to me, is already just a very interesting point by itself. 08:25.403 --> 08:28.689 [JM]: This person goes on to describe their experience with it. 08:29.270 --> 08:37.244 [JM]: And whether you agree with them or not, to me, is somewhat separate from this person being required to use these tools. 08:37.264 --> 08:40.910 [JM]: This is what this person's impression was after having used it. 08:40.930 --> 08:42.653 [JM]: I gave it an honest shot today, 08:42.633 --> 08:44.695 [JM]: using it as responsibly as I know how. 08:45.156 --> 08:49.440 [JM]: Only used it for stuff I already know how to do so that I can easily verify its output. 08:49.820 --> 08:59.630 [JM]: That part went okay, though I found it much harder to context switch between thinking about code structure and trying to herd a BS generator into writing correct code.
09:00.030 --> 09:05.696 [JM]: One thing I didn't expect, though, is how bleeping disruptive its suggestion feature would be. 09:05.716 --> 09:12.623 [JM]: It's like trying to compose a symphony while someone is relentlessly playing a kazoo in your ear. 09:12.603 --> 09:13.505 [JM]: Sorry. 09:14.146 --> 09:18.834 [DJ]: The reference to kazoo, the funniest named instrument, got me. 09:20.236 --> 09:24.223 [DJ]: I wanted to read this just really for that one sentence. 09:24.243 --> 09:32.417 [DJ]: And also because I'm just imagining the person with a kazoo with like a really intense look on their face as they're like... 09:33.410 --> 09:33.991 [DJ]: Oh, man. 09:34.011 --> 09:35.172 [DJ]: I love that so much. 09:35.252 --> 09:36.934 [DJ]: Thank you, random internet person. 09:36.974 --> 09:37.295 [DJ]: Right. 09:37.415 --> 09:40.398 [DJ]: For that mental image. 09:40.418 --> 09:40.879 [DJ]: So good. 09:41.359 --> 09:52.773 [JM]: So this person goes on to say, it flustered me really quickly, to the point where I wasn't able to figure out how to turn that, quote, "feature" off. 09:52.793 --> 09:56.798 [JM]: I am noticing physical symptoms of an anxiety attack as a result. 09:56.818 --> 09:59.921 [JM]: I stopped work early when I noticed I was completely spent. 09:59.941 --> 10:03.045 [JM]: I don't know if I wrote more code today than I would have normally. 10:03.025 --> 10:10.804 [JM]: I don't think I wrote better code, as the vigilance required is extremely hard for my particular brand of neurospicy to maintain. 10:11.265 --> 10:14.894 [JM]: And then this person ends it with, I don't think we were meant to live like this. 10:14.914 --> 10:15.796 [JM]: Neurospicy. 10:15.816 --> 10:16.318 [JM]: I like that one. 10:16.358 --> 10:18.503 [JM]: That's a nice way to describe that. 10:18.483 --> 10:19.724 [DJ]: You haven't come across that? 10:19.885 --> 10:20.986 [JM]: No, that's a new term for me.
10:21.026 --> 10:23.869 [DJ]: Have you been on the internet at all in the last five years? 10:24.290 --> 10:26.372 [DJ]: Man, I have so many thoughts about this. 10:26.392 --> 10:28.795 [DJ]: I can divide my thoughts into two sections. 10:28.835 --> 10:35.402 [DJ]: And one of them is I could share my own experiences doing likewise with AI-assisted code generation. 10:35.462 --> 10:48.237 [DJ]: But the first thing that jumps out at me is the incredible failure of leadership that is implied by this person's employer forcing them into this. 10:48.217 --> 11:00.947 [DJ]: Like, it sounds super hyperbolic when they describe experiencing the symptoms of an anxiety attack from just being annoyed by AI autocomplete. 11:01.248 --> 11:06.681 [DJ]: However, when you combine "I'm having a really hard time using this" with the 11:06.661 --> 11:11.868 [DJ]: fear that's been put inside them that, if I don't figure this out, I'm going to get fired. 11:11.888 --> 11:13.951 [DJ]: I am literally at a loss for words. 11:13.971 --> 11:35.200 [DJ]: Like, it's incredibly upsetting. What seems so tone-deaf about trying to force your employees to use AI assistants this way is: where do we get this notion that these tools are so clearly well aligned with people's work style that you can just mandate their use, right? 11:35.180 --> 11:40.749 [DJ]: I think that's what it comes down to, because let's consider other tools that you use for programming. 11:41.129 --> 11:55.973 [DJ]: Most of us who program professionally use some form of integrated development environment, which is a text editor that has a bunch of other tools built into it that make it easy to write, test, and build your code quickly. 11:55.953 --> 11:58.477 [DJ]: And there are lots of different ways of achieving that.
11:58.498 --> 12:10.298 [DJ]: Most people in their jobs don't write software using the simplest text editor built into their operating system, the same way you would compose an email. 12:10.599 --> 12:20.256 [DJ]: You usually have some kind of tool chain on top of that, especially in a professional setting where, like, a whole team of people with varied experience have to collaborate with each other. 12:20.236 --> 12:26.991 [DJ]: And in a setting like that, it makes sense that those sorts of tools would be required, at least to some extent. 12:27.352 --> 12:31.220 [DJ]: The place where I work, we all use tool chains of various kinds. 12:31.260 --> 12:35.690 [DJ]: We don't actually insist that developers use one tool or another. 12:35.770 --> 12:37.654 [DJ]: Most people use one thing. 12:37.634 --> 12:43.744 [DJ]: Some of us use a different app to do our code, and, like, no one cares as long as the work is getting done, right? 12:44.145 --> 12:45.046 [DJ]: That's the bottom line. 12:45.367 --> 12:57.387 [DJ]: But I will grant you that if someone was hired and they insisted on writing all of their code down on paper, to use an, I guess, intentionally sort of ridiculous example, they would be required to work differently. 12:57.407 --> 13:01.173 [DJ]: Like, if someone said, well, I'm just not comfortable using a computer to write my code, 13:01.194 --> 13:03.878 [DJ]: it would be like, well, sorry, you have to. 13:03.858 --> 13:06.362 [DJ]: Like, I don't think there'd be any movement on that. 13:06.382 --> 13:19.844 [DJ]: But we can't pretend that these AI-assisted tools are so ubiquitous and well-proven, and everyone can be expected to have a certain comfort level with them, that it's appropriate to mandate them like that. 13:19.965 --> 13:21.687 [DJ]: That's where this really breaks down for me.
13:21.708 --> 13:24.432 [DJ]: Like I think I probably do not share – 13:24.412 --> 13:31.747 [DJ]: all of the feelings this person evidently has in their post about AI assistant tools in general. 13:31.767 --> 13:40.845 [DJ]: The fact that they just up front describe it as a BS generator tells you a lot about how they feel about this thing's fundamental utility. 13:41.326 --> 13:42.589 [DJ]: But regardless... 13:42.569 --> 13:57.573 [DJ]: The notion that the right thing to do, as whoever that person's employer is, is to browbeat this person and cause them anxiety by trying to force them to use this tool that they're clearly not comfortable with. 13:57.914 --> 13:59.536 [DJ]: There's no justification for that. 13:59.757 --> 14:01.219 [DJ]: That's just bad leadership. 14:01.334 --> 14:03.898 [JM]: Yeah, this whole topic just raises so many questions, right? 14:04.259 --> 14:13.854 [JM]: For example, on average, is most software developers' productivity positively or negatively impacted by this kind of mandated tool use? 14:14.194 --> 14:14.916 [JM]: Or is it a wash? 14:15.396 --> 14:16.678 [JM]: Those are the only three options, right? 14:17.279 --> 14:27.275 [JM]: And there's at least some evidence, based on, I think, a limited sample size in a recent study I read about, that productivity went down and not up. 14:27.796 --> 14:28.337 [JM]: And 14:28.317 --> 14:38.555 [JM]: I'm sure there are going to be plenty more studies, and we'll see over time whether these tools on average tend to result in productivity gains or not. 14:38.575 --> 14:44.605 [JM]: But I would argue, like, let's just say that on average the productivity gain is, I don't know, 10%. 14:44.585 --> 14:47.829 [JM]: If you're a manager, you might think, okay, hey, 10%. 14:48.490 --> 14:50.753 [JM]: If that were a stock market return, you'd be thrilled. 14:51.054 --> 14:54.499 [JM]: So that's a win; we should definitely mandate that across the entire enterprise.
14:54.859 --> 15:07.977 [JM]: But what if a side effect of using this tool is that a very significant proportion of your employees are miserable, like they really just do not like it, they are not having fun, they dread getting up in the morning and going to work. 15:07.957 --> 15:19.059 [JM]: Because if that is a side effect for a significant number of your employees, then I don't think that 10% is going to be the win you think it is, because you're going to have a higher percentage of turnover. 15:19.580 --> 15:23.367 [JM]: And that's going to cost you way, way more than whatever 15:23.347 --> 15:25.210 [JM]: meager productivity gain you might get. 15:25.631 --> 15:28.697 [JM]: So I'm not saying that I know the answers to these questions. 15:28.717 --> 15:39.938 [JM]: But I do think that something that can't just be hand waved away is whether or not people enjoy using these tools, whether or not it makes them happy to use them. 15:40.036 --> 15:47.067 [JM]: relative to how much more productivity they get from it or the company on the whole gets from it. 15:47.488 --> 15:54.860 [JM]: It also makes me wonder how prevalent this is, because the author says that they work for a Fortune 100 company. 15:54.880 --> 15:57.164 [JM]: So this isn't some small company somewhere. 15:57.224 --> 15:59.648 [JM]: This is one of the biggest companies on the planet. 15:59.628 --> 16:04.052 [JM]: And they go on to say, talking to my peers, most are like this. 16:04.072 --> 16:05.213 [JM]: Some are far worse. 16:05.673 --> 16:09.757 [JM]: Now, again, this is just one person posting on the internet who knows if it's true. 16:10.238 --> 16:16.263 [JM]: But if it is true, it sounds like this is relatively prevalent and troubling. 16:16.283 --> 16:21.988 [JM]: By the way, if you get a chance, sift through the thread, because as usual, there will be a link in the show notes. 16:22.409 --> 16:24.530 [JM]: And there is some good stuff in this thread. 
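[JM's back-of-the-envelope argument can be made concrete with a toy calculation. Every number below (headcount, salary, turnover delta, replacement-cost multiplier) is invented purely for illustration; the only point is that a modest productivity gain can be swamped by a modest rise in turnover.]

```python
# Toy model of the productivity-vs-turnover trade-off discussed above.
# All inputs are invented for illustration, not real data.

HEADCOUNT = 1000
AVG_SALARY = 150_000        # hypothetical fully loaded annual salary
PRODUCTIVITY_GAIN = 0.10    # the optimistic 10% from the discussion
EXTRA_TURNOVER = 0.08       # hypothetical: 8 pp more employees quit per year
REPLACEMENT_MULT = 1.5      # hypothetical: replacing someone costs ~1.5x salary

gain = HEADCOUNT * AVG_SALARY * PRODUCTIVITY_GAIN                 # value of extra output
turnover_cost = HEADCOUNT * EXTRA_TURNOVER * AVG_SALARY * REPLACEMENT_MULT

net = gain - turnover_cost
print(f"gain ${gain:,.0f}, turnover cost ${turnover_cost:,.0f}, net ${net:,.0f}")
```

[With these made-up inputs the extra turnover costs more than the entire productivity gain is worth, which is exactly the "not the win you think it is" scenario.]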
16:24.550 --> 16:29.635 [JM]: Some of my favorites include, this makes Dilbert look like Family Circus. 16:29.615 --> 16:35.242 [DJ]: So this makes one crappy cartoon look like another crappy cartoon? 16:35.443 --> 16:47.078 [JM]: Sorry to the creators of Dilbert and of Family Circus, but... my favorite was this quote: artificial intelligence is literally hallucinogenic automatron data manipulation as a service. 16:47.558 --> 16:50.442 [JM]: Is automatron a word or... 16:50.422 --> 16:51.183 [JM]: Thank you. 16:52.064 --> 16:57.691 [JM]: Thank you so much, by the way, because I saw this and I'm like, Automatron? 16:58.131 --> 16:59.894 [JM]: Did they mean "automaton"? 17:00.254 --> 17:05.801 [JM]: It looks like Automatron is from some DLC expansion pack for the game Fallout 4. 17:06.061 --> 17:08.344 [JM]: So I don't know which was intended here. 17:08.865 --> 17:10.947 [JM]: It seems like either could be totally valid. 17:11.107 --> 17:12.489 [DJ]: I don't think I downloaded that one. 17:12.849 --> 17:13.550 [DJ]: Is it on sale? 17:13.871 --> 17:15.012 [DJ]: Got to get my hands on it. 17:14.992 --> 17:18.979 [DJ]: It's a hallucinatronic auto-ma-jing-a-ma-bob. 17:19.500 --> 17:32.562 [DJ]: I'm sorry to make fun of you, person on the internet I don't know, but I have to admit that while I am somewhat in alignment with the concerns of some of the people posting in this thread, 17:32.542 --> 17:34.706 [DJ]: they lose me at the tone a lot of the time. 17:34.726 --> 17:37.490 [DJ]: Well, all things in moderation, right? 17:37.630 --> 17:52.314 [DJ]: And I think I have a similar problem with the people who think that LLM-powered computer programs represent the next wave of evolution of life as I do with the people who write them off entirely. 17:52.575 --> 17:59.646 [DJ]: But then again, I think maybe the real place we can all aim our criticisms is at the behavior of human beings, as usual. 17:59.626 --> 18:01.729 [DJ]: I use AI-assisted coding.
18:01.749 --> 18:03.751 [DJ]: It does things that are annoying. 18:03.771 --> 18:19.952 [DJ]: I kind of agree with the person that a lot of the time, in particular with the autocomplete, where you start typing something and your editor just suggests a whole bunch of stuff which is not helpful, it's a little bit like someone playing a kazoo in your ear while you're trying to get work done. 18:20.112 --> 18:21.454 [DJ]: I agree that it's annoying. 18:21.474 --> 18:29.464 [DJ]: It never provokes an anxiety attack in me, but I think that's because my boss has never intimated that I might get fired if I don't use this stupid thing. 18:29.444 --> 18:31.608 [DJ]: And I think that's what makes all the difference. 18:31.628 --> 18:49.444 [DJ]: You know, you fabricated that example before about, like, 10% productivity versus turnover, and I thought the point makes sense even if the numbers don't, because I don't even think it's true that you could possibly do something that alienates a large proportion of your own employees and also simultaneously see a rise in productivity. 18:49.424 --> 19:00.092 [DJ]: The thing that it points out to me, and this is probably true in lots of corporate environments, is a loss of the entire understanding of what work is for. 19:00.493 --> 19:03.781 [DJ]: Like, what's the point of this Fortune 100 company? 19:04.162 --> 19:05.205 [DJ]: You know, a certain... 19:05.185 --> 19:11.453 [DJ]: perspective would say that the point of it is to maximize returns for its shareholders, capitalism, etc. 19:11.553 --> 19:28.695 [DJ]: But, to the extent that human beings work, we do labor, and there is something kind of noble or elevating in that in and of itself, then I think the only way you can approach things like, what do we do about these AI-assisted tools, 19:28.675 --> 19:36.817 [DJ]: is to work with the human beings you employ to figure out how they fit in, how the tools fit in with people.
19:37.198 --> 19:44.136 [DJ]: And these horror stories that we're starting to hear, how prevalent they are we don't know, but there are more than zero of them, 19:44.116 --> 19:47.300 [DJ]: suggest that some people have that the other way around. 19:47.320 --> 19:54.509 [DJ]: And instead, they're trying to cram human beings into a differently shaped box. 19:55.070 --> 20:04.182 [DJ]: And cramming human beings into spaces has always been the wrong thing to do, regardless of the reasons that an employer had for doing it. 20:04.502 --> 20:22.000 [DJ]: It is unfortunate to see the sorts of labor practices that we might have generally associated with an early 20th-century assembly line coming for the people who used to be the superstars of labor: software developers. 20:22.001 --> 20:24.940 [JM]: Yeah, I think Upton Sinclair would have thoughts. 20:24.920 --> 20:38.816 [JM]: Earlier today, I was working on an open source project, and I realized that the logger in this particular thing I was doing wasn't printing info-level statements to the console in the same way that it used to. 20:39.297 --> 20:41.800 [JM]: So apparently the behavior of Python changed at some point. 20:42.401 --> 20:49.489 [JM]: And I found someone asking this question on Stack Overflow, and there's an answer that says, oh yeah, in later versions of Python you have to do it this way. 20:50.070 --> 20:54.555 [JM]: And I have that window open on the left, and I have my editor open on the right, and I'm typing 20:54.535 --> 20:56.998 [JM]: in code that I'm looking at on the left. 20:57.298 --> 21:07.149 [JM]: And as I mentioned in a recent episode, I now have a Vim plugin that connects to a local large language model running on my machine via llama.vim.
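[For what it's worth, the Python behavior JM describes is most likely the root logger's defaults: bare `logging.info()` calls are dropped because the default effective level is WARNING, and the usual Stack Overflow fix is an explicit `logging.basicConfig` call. A minimal sketch, assuming that is what JM ran into; the format string is arbitrary and this may not be the exact answer he copied:]

```python
import logging

# With no configuration, the root logger's effective level is WARNING,
# so this info-level message is silently dropped.
logging.info("you will not see this")

# The common fix: configure the root logger explicitly.
# force=True (Python 3.8+) clears any handlers already installed, which
# matters here because the bare info() call above auto-installed one.
logging.basicConfig(
    level=logging.INFO,
    format="%(levelname)s %(name)s: %(message)s",
    force=True,
)
logging.info("now this reaches the console")
```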
21:07.629 --> 21:19.682 [JM]: And as I'm typing the first few letters of this line that I'm just trying to copy out of this window on the left from Stack Overflow, it just completes, like, the next three lines exactly the way it is over on Stack Overflow. 21:19.662 --> 21:23.787 [JM]: For people who are accustomed to this, it's probably no big deal. 21:23.847 --> 21:25.689 [JM]: It's just, yeah, that's how it works, dude. 21:25.709 --> 21:27.191 [JM]: Like, that's the beginning and the end of it. 21:27.611 --> 21:30.855 [JM]: These are tools that I'm only recently starting to experiment with. 21:31.015 --> 21:33.378 [JM]: And that happened and I just paused. 21:33.518 --> 21:35.961 [JM]: I was like, whoa, I did not expect that to happen. 21:35.981 --> 21:38.423 [JM]: Like, that's a lot less typing that I have to do. 21:38.443 --> 21:45.772 [JM]: I mean, granted, I probably could have copied and pasted it, but I would have had to do it in a couple of operations just to get it formatted right. 21:46.132 --> 21:47.694 [JM]: And this just did it all at once. 21:47.814 --> 21:49.416 [JM]: And I was like, wow, that's pretty cool. 21:49.396 --> 21:55.006 [JM]: Now, I've seen it, as you said before, spit out things that are totally irrelevant, but I just don't tab-complete those things. 21:55.347 --> 21:59.976 [JM]: So I'm starting to see some of the value of these tools in what I'm doing. 22:00.497 --> 22:07.229 [JM]: Even if it's just stuff I'm already looking at and couldn't easily have copied and pasted, it's still saving me a bit of typing or copying and pasting. 22:07.289 --> 22:08.772 [JM]: And at the same time, 22:08.752 --> 22:10.577 [JM]: no one's telling me I have to do this. 22:10.597 --> 22:12.562 [JM]: Like, I'm doing this as an experiment. 22:12.622 --> 22:14.667 [JM]: I'm doing this to see if there's value. 22:15.229 --> 22:16.692 [JM]: And I am getting some value from it. 22:17.314 --> 22:17.815 [JM]: And it's fun.
22:18.497 --> 22:23.450 [JM]: And the moment that it stops being fun, even if it's giving me value, is the moment that I stop doing it. 22:23.931 --> 22:24.432 [JM]: And... 22:24.412 --> 22:29.399 [JM]: I understand that I am saying that from a position of privilege, and I want to fully recognize that. 22:29.779 --> 22:31.562 [JM]: But that is the position I'm in. 22:31.582 --> 22:38.691 [JM]: And I think it's terrible that there are people that are in what sounds like very different positions, and I don't envy them. 22:38.711 --> 22:44.479 [JM]: I saw this related thing that someone created, and I'll put a link to it 22:44.459 --> 22:47.985 [JM]: in the show notes. It's hard to describe, but it's essentially an animation. 22:48.125 --> 22:55.096 [JM]: And if you watch it, I recommend doing it in some kind of video player where you can pause it, because otherwise you're going to miss some of it. 22:55.517 --> 22:57.921 [JM]: And you can kind of scrub along to get it. 22:57.941 --> 23:04.071 [JM]: But essentially what this is, and I'll just describe it briefly, is someone typing and then having... 23:04.051 --> 23:15.874 [JM]: the large language model complete it, and they're just putting all kinds of pithy, ridiculous things in there to kind of poke fun at how useless these things can sometimes be, right? 23:15.995 --> 23:18.179 [JM]: Because they can be useless sometimes. 23:18.199 --> 23:25.333 [JM]: One of the completions as this person is typing is: "Having a computer constantly trying to guesstimate what I'm going to type is..." 23:25.313 --> 23:30.640 [JM]: And they're starting to type the letter A and then it completes it as: "...astonishing! 23:30.660 --> 23:32.683 [JM]: They appear so quickly, so frequently, so much! 23:32.904 --> 23:33.865 [JM]: It really helps me focus!
23:34.286 --> 23:41.796 [JM]: It feels like I've got a golden retriever in my head and it's bringing me random household objects directly to me in the desperate hope that I'll smile back at it and give it half a pet!" 23:42.157 --> 23:46.022 [DJ]: That is actually a pretty accurate description of what it's like. 23:46.137 --> 23:50.705 [DJ]: What you said before about when it stops being fun, I stop doing it. 23:50.725 --> 24:04.551 [DJ]: That's exactly what I think is so wrongheaded about these mandates, because I've also had this experience of you start writing some code and the thingamabob says, I think the next five lines you want to write are this. 24:04.531 --> 24:08.278 [DJ]: And sometimes you go, yep, great, that saved me some time. 24:08.298 --> 24:14.230 [DJ]: Because let's be honest, when you're writing computer code, most of what you write is not brilliant. 24:14.670 --> 24:17.315 [DJ]: It's not an expression of your deep creative self. 24:17.355 --> 24:18.858 [DJ]: It's boilerplate for the most part. 24:18.898 --> 24:22.666 [DJ]: It's like, yeah, open a database connection for the thousandth time. 24:22.686 --> 24:24.970 [DJ]: Take some data out of this map and put it in that map. 24:24.950 --> 24:27.014 [DJ]: It's nice to have that stuff automated. 24:27.374 --> 24:32.143 [DJ]: But on the other hand, a lot of the time this thing goes, oh, you probably want this. 24:32.303 --> 24:35.348 [DJ]: And I'm like, no, I don't. 24:35.368 --> 24:36.110 [DJ]: Please go away. 24:36.550 --> 24:42.120 [DJ]: Sometimes, and perhaps this is getting less common as we go, these things go truly off the rails. 24:42.200 --> 24:48.932 [DJ]: And one of the funniest things I ever saw was I wrote the method header for a unit test that was like, test the thing. 
24:48.912 --> 24:56.908 [DJ]: And it helpfully suggested that the next lines should be test the thing, the thing, test the thing, the thing, the thing, the thing, test the thing, the thing, the thing, the thing, the thing. 24:56.968 --> 24:59.613 [DJ]: And on and on and on and on and on for like 30 lines. 24:59.693 --> 25:04.483 [DJ]: And I ended up taking a screenshot of that and posting it in my team Slack. 25:04.503 --> 25:07.208 [DJ]: And I said something like, go home, Copilot, you're drunk. 25:07.188 --> 25:09.452 [JM]: This cost $30 billion to train. 25:09.832 --> 25:10.653 [DJ]: Yeah, exactly. 25:10.914 --> 25:11.936 [DJ]: But so that's just it. 25:11.976 --> 25:17.645 [DJ]: It's like these tools are in such a nascent state that sometimes they're awesome and other times they're not awesome. 25:17.985 --> 25:23.554 [DJ]: And acknowledging that, our attitude towards them should be, well, let's see. 25:23.574 --> 25:24.575 [DJ]: Let's see how it does. 25:25.116 --> 25:27.480 [DJ]: But like, if it sucks, stop doing it. 25:27.500 --> 25:30.305 [DJ]: Instead of the like, you have to use this or you're fired. 25:30.525 --> 25:35.052 [DJ]: Even if it's doing nonsense, that's just not going to lead us anywhere good. 25:35.639 --> 25:36.741 [JM]: Okay, moving on. 25:36.921 --> 25:40.226 [JM]: I just wanted to touch on this small article. 25:40.246 --> 25:44.612 [JM]: And I just say small because it was not particularly meaningful or eventful. 25:44.753 --> 25:48.298 [JM]: It was someone, well, let's just start with the title of the article. 25:48.418 --> 25:54.407 [JM]: The title of the article is, The Real Future of AI is Ordering Mid-Chicken at Bojangles. 25:54.387 --> 26:00.277 [JM]: And so right off the bat, I mentioned this article to someone and they're like, I'm sorry, did you say "mid"? 26:00.457 --> 26:01.138 [JM]: Yeah, mid chicken. 26:01.219 --> 26:01.840 [JM]: And I'm like, yeah. 
26:02.260 --> 26:03.823 [JM]: And what the heck does that mean? 26:04.364 --> 26:07.269 [JM]: And I said, oh, you don't know? That's Zoomer speak. 26:07.289 --> 26:08.491 [JM]: And they're like, okay, well, wait, hold on. 26:08.511 --> 26:09.092 [JM]: What's a "Zoomer"? 26:10.695 --> 26:12.257 [JM]: I feel like I don't understand what's happening. 26:12.418 --> 26:14.421 [DJ]: Wait, wait, wait, wait, wait. 26:14.570 --> 26:20.798 [DJ]: Did you go back in time and you were talking to Abraham Lincoln, former president of the United States of America? 26:20.818 --> 26:22.481 [DJ]: Is that who you were talking to? 26:22.501 --> 26:24.584 [DJ]: Were you in King Arthur's court? 26:26.927 --> 26:35.158 [JM]: And so I went on to explain that my understanding is that a Zoomer is like a boomer, but for Generation Z. So that's what a Zoomer is. 26:35.218 --> 26:37.862 [JM]: It's a member of Generation Z, I guess. 26:37.922 --> 26:38.222 [JM]: I don't know. 26:38.663 --> 26:40.425 [JM]: I don't even understand half the stuff myself, but... 26:40.405 --> 26:43.791 [JM]: Somehow I understood enough to explain this to this person. 26:43.811 --> 26:49.342 [JM]: And I said, okay, well, a Zoomer would say something is "mid" when they mean that it's like, fine. 26:49.643 --> 26:50.945 [JM]: It's like, it's not great. 26:51.226 --> 26:51.987 [JM]: It's not terrible. 26:52.167 --> 26:52.548 [JM]: It's mid. 26:52.889 --> 26:55.053 [JM]: So that's apparently what that means. 26:55.073 --> 26:55.714 [JM]: You're welcome. 26:55.694 --> 26:58.437 [DJ]: So apparently "glizzy" means hot dog. 26:58.878 --> 27:02.042 [DJ]: And that one I never could have figured out without having had it explained to me. 27:02.222 --> 27:04.024 [DJ]: And I might also have been tricked. 27:04.244 --> 27:09.230 [DJ]: But like "mid" at least, you can kind of infer what that probably means. 27:09.290 --> 27:12.114 [DJ]: Do you think that mid means great? 27:12.134 --> 27:13.215 [DJ]: Like, probably not. 
27:13.776 --> 27:13.956 [DJ]: Right. 27:14.336 --> 27:17.380 [DJ]: Incidentally, our sponsor for this episode, Bojangles. 27:17.783 --> 27:19.525 [JM]: That should have been the real question, right? 27:20.326 --> 27:23.991 [JM]: As I'm describing this to someone, they should have been like, what the heck is Bojangles? 27:24.011 --> 27:25.053 [DJ]: What's a Bojangles? 27:25.113 --> 27:25.834 [DJ]: Yeah, exactly. 27:26.194 --> 27:27.255 [DJ]: What's a Zoomer? 27:27.476 --> 27:29.719 [DJ]: Anyway, what's this article about, Justin? 27:29.739 --> 27:44.018 [JM]: So this article is about someone who went to order chicken at the drive-thru at Bojangles, which I guess is some chain of mid-chicken somewhere in a part of the world that I don't live because I've never heard of Bojangles. 27:44.298 --> 27:45.900 [JM]: It's a mid-chicken chain, yeah. 27:45.880 --> 28:00.633 [JM]: And apparently some company sold this LLM-powered tool to Bojangles to install in their drive-thrus so that people can order their chicken without talking to an actual human. 28:01.354 --> 28:06.439 [JM]: And the author of this article just describes the experience as being somewhat pedestrian. 28:06.819 --> 28:12.504 [JM]: They go, they place their order, and their only real complaint was the constant upsells. 28:12.564 --> 28:14.726 [JM]: It was like, do you want watermelon juice with your order? 28:14.986 --> 28:15.887 [JM]: No, I don't. 28:15.867 --> 28:17.750 [JM]: Do you want a cherry pie with that? 28:17.890 --> 28:19.392 [JM]: No, no, thank you. 28:19.412 --> 28:23.899 [JM]: But aside from the upselling, this process appears to have been very efficient. 28:24.300 --> 28:25.421 [JM]: There were no mistakes made. 28:25.862 --> 28:29.147 [JM]: The human at the window handed this person the order. 28:29.227 --> 28:30.329 [JM]: The order was correct. 28:30.349 --> 28:32.592 [JM]: So no complaints, really. 
28:32.612 --> 28:38.822 [JM]: I think the point of the article, the reason this person wrote it was just to say, like, it was fine. 28:39.042 --> 28:41.706 [JM]: It was not particularly notable. 28:41.686 --> 28:46.190 [JM]: And this is something that is going to be the future. 28:46.430 --> 28:53.557 [JM]: It's probably not going to be Skynet firing nuclear missiles and wiping humanity off the face of the planet. 28:53.817 --> 29:01.000 [JM]: It's just going to be us dealing with LLM software in place of people like when we call our bank and say like, hey... 29:01.500 --> 29:09.251 [DJ]: Operator, operator. "If you'd like to hear your bank account balance, please say 'balance'". Operator! 29:09.591 --> 29:10.832 [DJ]: That's what it's already like. 29:10.964 --> 29:12.666 [JM]: Well, yes and no. 29:12.686 --> 29:20.654 [JM]: Presumably the software that Bojangles has procured and installed at these windows is a little bit more sophisticated, right? 29:20.674 --> 29:24.638 [JM]: Granted, the questions are somewhat predetermined, right? 29:24.678 --> 29:30.324 [JM]: Like there's a limited number of things that you want when you go to order chicken at Bojangles, right? 29:30.745 --> 29:36.651 [DJ]: I think that's what I meant when I made that joke, though, was that like, yes, it's a slightly more sophisticated version, but it's 29:36.631 --> 29:47.001 [DJ]: replacing a human being for the same kind of task, which is effectively selecting some options from some finite set so that you can get your thing. 29:47.021 --> 29:57.131 [DJ]: I guess there's a question in there about like an obvious criticism is like, oh, well, we're surrendering real human interaction to machines, but you have to ask about the quality of the interaction. 29:57.211 --> 30:02.817 [DJ]: Like, does anyone on either side of a traditional drive-through food transaction get anything out of that? 30:03.017 --> 30:03.077 [JM]: No. 
30:03.057 --> 30:07.164 [JM]: Well, I think my point is when you say, you know, operator, because that's what I do. 30:07.184 --> 30:19.243 [JM]: That's what most people I know do is you say like live agent, representative, operator, whatever your keywords are, you just start throwing them out there so that this automated system will route you to a human. 30:19.544 --> 30:22.348 [JM]: The reason that we're doing that is because it sucks. 30:22.328 --> 30:23.190 [DJ]: Oh, right. 30:23.210 --> 30:23.911 [DJ]: I see your point. 30:24.192 --> 30:24.472 [JM]: Yeah. 30:24.572 --> 30:27.237 [JM]: If it were good, we wouldn't be doing that. 30:27.798 --> 30:32.728 [JM]: And presumably what this article is trying to get across is like this person didn't need to do that. 30:32.788 --> 30:37.036 [JM]: They didn't need to escalate to a human because this software behaved just fine. 30:37.597 --> 30:40.883 [JM]: And I think that's soon going to become the norm. 30:41.024 --> 30:44.370 [JM]: Oh, and also that assumes that you know what's happening. 30:44.350 --> 30:44.751 [JM]: Right. 30:44.771 --> 30:52.101 [JM]: Because like most of the time when we talk to these systems, when we call our banks and whatnot, it's real clear that we're not talking to a human. 30:52.481 --> 30:56.026 [JM]: But if instead it behaved like, "Hi, this is Alice. 30:56.046 --> 30:56.607 [JM]: How can I help you?" 30:57.448 --> 31:00.332 [JM]: And you asked her a question and you got an answer. 31:00.753 --> 31:02.916 [JM]: And the entire interaction, from beginning to end, 31:02.896 --> 31:04.879 [JM]: you couldn't tell if you were talking to a human? 31:04.899 --> 31:09.626 [JM]: Well, on some level, I think we can all agree that feels uncomfortable. 31:09.646 --> 31:12.751 [JM]: But if you really can't tell, I don't know that it's bad. 31:12.931 --> 31:15.500 [JM]: I don't know. I mean, I don't like it, but... 
31:15.520 --> 31:32.700 [DJ]: There's an assumption, and I think what I don't like is the assumption, which is that if you had a system that can effectively operate on natural language to achieve the desired result, there is no actual reason that it needs to pretend to be a human. 31:32.680 --> 31:33.662 [DJ]: You just want the result. 31:33.682 --> 31:37.748 [DJ]: You're not calling your bank because you really want to connect with Alice. 31:38.329 --> 31:45.280 [DJ]: So if these systems are good enough, you could just call your bank and it could just be like, "Hey, we're going to talk natural language." 31:45.481 --> 31:48.265 [DJ]: "But like you and I both know I'm a computer, right?" 31:48.285 --> 31:49.427 [DJ]: And you're like, "Yeah, I get it. 31:49.447 --> 31:51.070 [DJ]: So tell me my bank account balance." 31:51.170 --> 31:52.712 [DJ]: "Well, it looks like blah, blah, blah, blah", right? 31:52.732 --> 31:53.694 [DJ]: Like that would be fine. 31:53.894 --> 31:55.617 [DJ]: The problem enters when 31:55.597 --> 32:05.424 [DJ]: we just accept the notion that if these systems get good enough, they can make up personalities for themselves, and that's what they should do. 32:05.444 --> 32:06.286 [DJ]: Maybe they shouldn't. 32:06.727 --> 32:08.232 [DJ]: Let the robots be robots. 32:08.252 --> 32:09.415 [DJ]: I think that's what I'm saying. 32:09.776 --> 32:10.618 [JM]: That's a good point. 32:10.598 --> 32:14.561 [JM]: It is better not to be deceptive about it and to just be up-front. 32:14.821 --> 32:18.261 [JM]: Because then it's clear, and also, you don't care, right? 32:18.581 --> 32:24.561 [JM]: Because you might prefer to talk to a human, but in the end, most of us don't want to be making the phone call in the first place, right? 32:24.821 --> 32:27.561 [JM]: If we could do this from the web, we would have already done it! 
32:27.541 --> 32:33.090 [JM]: The whole reason that we're calling is because we couldn't and we're trying to get this thing we couldn't solve... solved. 32:33.551 --> 32:47.273 [JM]: And if some piece of software can solve it more efficiently, if it means that I don't have to sit in a queue and wait for 10 to 15 minutes for some operator to come on the line and I can get it done quickly and I don't really notice the difference. 32:47.374 --> 32:49.437 [JM]: Oh, and I know that I'm talking to a piece of software and don't care. 32:49.838 --> 32:50.098 [JM]: Great. 32:50.379 --> 32:52.502 [JM]: I really can't complain with that too much. 32:52.482 --> 33:10.110 [DJ]: Yeah, I have mixed feelings about it because there was a time recently when I was using a car share car and I got confused about something and I accidentally left the car in a problematic state where I had wanted to end my rental, but I'd like locked the car and couldn't get access to it. 33:10.130 --> 33:12.013 [DJ]: But it was still checked out to me. 33:12.553 --> 33:15.218 [DJ]: Usually you operate these cars from a phone app. 33:15.238 --> 33:18.743 [DJ]: And so I had gotten myself into one of the rare circumstances where like 33:18.723 --> 33:19.964 [DJ]: I need a person. 33:20.545 --> 33:21.045 [DJ]: I need help. 33:21.185 --> 33:23.167 [DJ]: Like, raising my hand, please come help. 33:23.187 --> 33:27.991 [DJ]: So I had to call the hotline, and I got a helpful person. 33:28.011 --> 33:30.593 [DJ]: And then later that same day, my parents were visiting. 33:30.753 --> 33:33.135 [DJ]: I was driving them around in these car share cars. 33:33.155 --> 33:36.518 [DJ]: Later that day, my mom thought she left, like, her bag in the car. 33:36.899 --> 33:41.342 [DJ]: So I called their hotline again, and I'm pretty sure I got the same person. 33:41.703 --> 33:44.946 [DJ]: And you have an account associated with this, so they know who they're talking to. 
33:45.206 --> 33:47.768 [DJ]: So I managed, we found my mom's bag. 33:47.748 --> 33:53.362 [DJ]: And I managed to have a, you know, three-second long friendly exchange with this person where I was like, oh, you know, I got it sorted. 33:53.402 --> 33:58.816 [DJ]: And they were just like, you know, and they're just, I don't know, they said something about being glad they could help today or whatever. 33:58.836 --> 34:03.748 [DJ]: And I realize I don't need a human being in that situation anymore. 34:03.728 --> 34:07.916 [DJ]: Really all I... Because to your point, like, I didn't want to be in this situation in the first place. 34:07.956 --> 34:11.323 [DJ]: If there was just a button I could have pushed, I would have pushed it. 34:11.523 --> 34:15.831 [DJ]: But that being said, I kind of liked having an interaction with another human being. 34:15.931 --> 34:19.378 [DJ]: But on the other hand, it's not like I don't know who that person is. 34:19.418 --> 34:20.680 [DJ]: I'm never going to speak to them again. 34:20.720 --> 34:23.606 [DJ]: It's not like I actually created some sort of a relationship. 34:23.706 --> 34:24.167 [DJ]: I guess... 34:24.147 --> 34:29.620 [DJ]: This whole thing is just making me think with increasing automation, it reduces human interaction. 34:29.921 --> 34:30.663 [DJ]: Okay, fine. 34:30.683 --> 34:37.740 [DJ]: But then that – I feel this way about a lot of this generative AI stuff where it forces us to re-examine assumptions that we've made. 34:38.221 --> 34:41.228 [DJ]: And in this case, I guess the assumption I'm re-examining is: 34:41.208 --> 34:48.905 [DJ]: Do those sort of unintentional human interactions that I've had to have because a human was the only way to get your car unlocked, do those matter? 34:49.487 --> 34:58.086 [DJ]: Or are there other sorts of human interactions I ought to be pursuing with the extra time I save because the LLMs are taking care of my drive-thru orders? 34:58.066 --> 34:58.828 [DJ]: I don't know. 
34:58.848 --> 34:59.910 [JM]: It's a good question. 34:59.930 --> 35:03.697 [DJ]: It's almost as good a question as who wants watermelon juice as an upsell. 35:03.717 --> 35:05.341 [DJ]: I'm going to assume that's a regional thing. 35:05.441 --> 35:06.002 [JM]: Beats me. 35:06.042 --> 35:12.074 [JM]: There are definitely times where I have interactions with people in a customer service context, and I agree with you. 35:12.094 --> 35:13.898 [JM]: It is a point of human connection. 35:13.878 --> 35:17.304 [JM]: And those, I think, are going to gradually go away. 35:17.724 --> 35:20.769 [JM]: For the most part, if I'm being honest, that's probably a win. 35:21.050 --> 35:27.640 [JM]: Because I would say that probably the majority of these interactions don't give me that feeling. 35:27.680 --> 35:32.949 [JM]: They don't give me the feeling of like, oh, I'm glad I connected with another human, even on a subconscious level. 35:33.009 --> 35:40.882 [JM]: I think most of the time, my interactions with humans in this kind of context are often annoying, if I'm being totally honest. 35:40.962 --> 35:43.044 [JM]: And if this solves that, that's great. 35:43.445 --> 35:44.706 [JM]: But you're right. 35:44.726 --> 35:48.990 [JM]: It does have the potential to eliminate some of the nicer interactions that sometimes happen. 35:49.591 --> 35:54.736 [JM]: And ultimately, how we feel about it, whether we like it or not, it's kind of irrelevant, right? 35:54.756 --> 36:00.222 [JM]: Like that's like the point of this article is that this is just going to happen and it's going to become normal. 36:00.462 --> 36:03.725 [JM]: And whether you like it or don't like it isn't going to matter very much. 36:04.286 --> 36:07.189 [JM]: I don't really see companies saying like, "you know what? 36:07.169 --> 36:17.889 [JM]: We're going to differentiate our service by staffing our customer service with humans, unlike all these other people that are outsourcing that stuff to generative software. 
36:18.330 --> 36:20.654 [JM]: And that's going to be our competitive difference. 36:20.674 --> 36:23.760 [JM]: That's how we're going to differentiate our company and our product and our service." 36:23.740 --> 36:24.301 [JM]: I don't know. 36:24.481 --> 36:26.403 [JM]: I don't see that happening. 36:26.443 --> 36:30.887 [JM]: Like, sure, it's possible, but it's also way more costly. 36:30.907 --> 36:33.689 [JM]: And I don't know that enough people care. 36:34.330 --> 36:39.255 [JM]: And I don't think that's a strategy that's going to succeed, unfortunately perhaps. 36:39.275 --> 36:43.939 [JM]: Again, maybe that's not unfortunate, depending on, like I said, maybe this is just not a question of judgment. 36:44.340 --> 36:44.800 [JM]: "This is good." 36:44.820 --> 36:45.280 [JM]: "This is bad." 36:45.300 --> 36:46.401 [JM]: It just *is*. 36:46.421 --> 36:48.003 [DJ]: I think it's more that it just is. 36:48.023 --> 36:53.308 [DJ]: Like, I actually think it might be the case that there are probably certain domains in which a 36:53.288 --> 36:59.237 [DJ]: winning strategy in an age of increasing automation is increased human contact, but it depends. 36:59.397 --> 37:03.964 [DJ]: There's the notion of the difference between a high-touch and a low-touch interaction. 37:04.465 --> 37:09.913 [DJ]: When you're trying to get somebody to spend $50,000 on a car, you don't do that with a drive-thru window. 37:09.933 --> 37:16.883 [DJ]: When you're trying to get someone to spend $15 on fried chicken, you don't need a salesperson to spend an hour with them. 37:16.863 --> 37:17.124 [DJ]: Right. 37:17.144 --> 37:19.528 [DJ]: Like, there are those kinds of distinctions. 37:19.588 --> 37:28.787 [DJ]: But I think regardless, the point of the article remains, which is in the same way that the Internet gradually became completely ubiquitous 20 years ago, 37:29.368 --> 37:33.677 [DJ]: LLMs taking care of a lot of sort of low-value human interaction. 
37:34.498 --> 37:36.382 [DJ]: I'm saying, yeah, I'm calling it low value. 37:36.402 --> 37:37.825 [DJ]: You can argue with me if you'd like. 37:37.805 --> 37:39.608 [DJ]: That is probably just going to happen. 37:40.109 --> 37:48.784 [DJ]: And yeah, to some extent, we are for the most part just going to... 10 years from now, it's just going to feel normal that when you drive through a drive-thru, you're not talking to a person anymore. 37:48.884 --> 38:00.023 [DJ]: The person is spending their time inside preparing the food or whatever, instead of having the little earpiece going like, I mean, that's how every drive-thru I've ever gone through has sounded. 38:00.043 --> 38:01.365 [DJ]: I don't know about you. 38:01.345 --> 38:11.707 [DJ]: Something that really interests me is that notion of, with gen AI disrupting a lot of assumptions about things, I think it calls on us to get intentional. 38:11.727 --> 38:18.882 [DJ]: So thinking about this in terms of generating content, to go back to the AI-assisted coding and writing and stuff like that. 38:18.862 --> 38:22.469 [DJ]: And AI-assisted art generation, that's a super hot topic. 38:22.830 --> 38:23.972 [DJ]: Do I think it's good? Bad? 38:24.112 --> 38:24.793 [DJ]: I don't know. 38:25.014 --> 38:34.612 [DJ]: I think the fact that this exists now, it calls on people to re-examine, well, what does it mean to be a writer if a computer program can generate text? 38:35.013 --> 38:37.157 [DJ]: Again, I'm not saying the computer program 38:37.137 --> 38:40.420 [DJ]: should or is good at generating the text. 38:40.440 --> 38:47.928 [DJ]: I'm saying that it's time to re-evaluate what human creativity is when it is not merely the generation of stuff. 38:48.208 --> 38:54.915 [DJ]: And I think likewise, we are called on to re-evaluate what does human connection mean? 38:55.216 --> 38:56.057 [DJ]: What is it about? 38:56.297 --> 38:57.758 [DJ]: When is it important? 38:57.778 --> 39:01.322 [DJ]: And how do we really develop it? 
39:01.302 --> 39:05.206 [DJ]: in a world where it's easier and easier to not have to have it. 39:05.226 --> 39:07.689 [DJ]: I've lived in big cities for pretty much my whole life. 39:07.889 --> 39:09.510 [DJ]: I've lived in apartment buildings. 39:09.530 --> 39:20.622 [DJ]: And one thing I'll notice is that it seems like the more people there are in closer proximity to each other, the more we all just sort of reflexively try to pretend no one else exists. 39:20.642 --> 39:26.668 [DJ]: Like part of this might just be my personality versus other people's personality, but I'm not big on... 39:26.648 --> 39:34.097 [DJ]: stopping and having conversations with all of my neighbors and really getting to know them and like smiling at strangers as I walk down the street. 39:34.117 --> 39:37.661 [DJ]: And in the city where I live, most people don't do that. 39:37.681 --> 39:45.450 [DJ]: I'm under the impression that in smaller communities, it's much more common for people to stop and talk to each other and get to know each other. 39:45.470 --> 39:56.082 [DJ]: Again, maybe I'm wrong, but to the extent that increased population density has changed the circumstances under which people will and won't interact with each other, 39:56.062 --> 40:07.855 [DJ]: I think something similar is going on here, where we could say, well, it's a good thing that you're being forced to interact with a person in a call center in Ireland in order to learn your bank balance because that's human interaction. 40:08.275 --> 40:10.097 [DJ]: Well, maybe, but that's going away. 40:11.018 --> 40:21.189 [DJ]: And one way or another, I think that is an interesting opportunity for us to go, well, wait, if we still value human interactions, and *we do*, how do we get more of them? 40:21.209 --> 40:23.672 [DJ]: And how do we get more of the ones that really do matter? 40:23.652 --> 40:25.797 [JM]: Yeah, I'm not sure I know the answer to that. 
40:25.817 --> 40:35.477 [JM]: It does seem like a lot of people increasingly are getting that need fulfilled via this kind of generative software. 40:35.497 --> 40:38.704 [JM]: And that is its own topic that we touched on in a previous episode. 40:39.065 --> 40:41.610 [JM]: But I mean, you know, maybe the answer... 40:41.590 --> 40:49.560 [JM]: when you're looking for that kind of connection, is you just go and try to find a nice, neurospicy automaton at Bojangles. 40:50.241 --> 40:54.767 [DJ]: Is there some way you could work more Zoomer lingo into that description? 40:56.188 --> 40:57.610 [JM]: All right, that's all for this episode. 40:57.670 --> 40:59.473 [JM]: Thanks, everyone, for listening. 40:59.493 --> 41:04.579 [JM]: You can find me on the web at justinmayer.com, and you can find Dan at danj.ca. 41:05.240 --> 41:09.285 [JM]: Please reach out with your thoughts via the Fediverse at justin.ramble.space.