WEBVTT 00:00.031 --> 00:05.962 [JM]: As you know, we like to start off the show usually by highlighting something interesting that we have found on the internet. 00:06.403 --> 00:11.392 [JM]: And man, did this week ever deliver in my humble opinion. 00:11.453 --> 00:20.149 [JM]: I would like to introduce you to an article called "My Journey to the Microwave Alternate Timeline". 00:20.129 --> 00:28.219 [JM]: Posted on a site called Less Wrong, which is a site we have mentioned before because we have seen other great posts that have appeared there. 00:28.619 --> 00:40.173 [JM]: This is one of those articles where I start reading it and by the second or third paragraph, not only am I fully in, but I am just cracking up. 00:40.573 --> 00:41.835 [JM]: The writing is amazing. 00:41.875 --> 00:48.803 [JM]: There are sometimes words that are used that I have to look up, which always makes me happy because... 00:48.783 --> 00:51.986 [JM]: I love learning new things and new words are a fun new thing to learn. 00:52.527 --> 00:57.431 [JM]: And in the interest of full disclosure, I have not fully read this article. 00:57.892 --> 00:59.173 [JM]: I have read about half of it. 00:59.633 --> 01:04.118 [JM]: And this is a little bit of an insight into the wild mind that is Justin. 01:04.198 --> 01:15.969 [JM]: And that is that I am one of those people that sometimes if I go on a trip and I buy some snack in some foreign land and I really like it and I go back home, I will eat some of it, but then like leave some of it. 01:16.249 --> 01:18.351 [JM]: Because I don't want it to be over yet. 01:18.331 --> 01:19.714 [JM]: And I'll try to drag it out. 01:20.155 --> 01:21.899 [JM]: And to some degree, I think that's what happened here. 01:21.939 --> 01:27.411 [JM]: It was too good to waste it, to use a weird word, all in one sitting. 01:27.852 --> 01:30.278 [JM]: So I don't know how this story ends. 
01:30.679 --> 01:37.875 [JM]: But the beginning was so good that I didn't really need to read the rest of it to recommend it to you, Dan, and to everyone. 01:37.855 --> 01:40.678 [JM]: To all of our listeners. 01:40.698 --> 01:48.426 [JM]: And the gist of it is this person's exploration of the dawn of microwave cooking in the 1980s. 01:49.347 --> 01:58.897 [JM]: And specifically, this person's cookbook, which is apparently somewhat internet famous for being the world's saddest cookbook. 01:59.258 --> 02:02.361 [JM]: And the title of the cookbook is "Microwave Cooking for One". 02:02.341 --> 02:09.407 [JM]: And even if you didn't know anything about it, you could probably guess from the title that it might qualify as the world's saddest cookbook. 02:09.427 --> 02:12.090 [DJ]: There's nothing wrong with enjoying a meal alone, Justin. 02:13.191 --> 02:14.052 [JM]: Definitely not. 02:14.072 --> 02:16.153 [JM]: I think it's the combination, right? 02:16.234 --> 02:23.580 [JM]: It's the idea of microwaving your food for yourself and then dedicating an entire cookbook to doing that. 02:23.600 --> 02:29.606 [JM]: I think that's where the internet-granted title of world's saddest cookbook comes from. 02:29.646 --> 02:32.348 [JM]: So the person's goal in writing this was, 02:32.328 --> 02:33.630 [JM]: apparently twofold. 02:34.251 --> 02:38.998 [JM]: One was to know if there was any merit to all these forgotten microwave techniques. 02:39.478 --> 02:49.392 [JM]: And the other was to get a glimpse of what the world would look like if the future envisioned by people like the person who wrote this cookbook actually came to pass. 02:49.493 --> 02:58.986 [JM]: Like what would our world look like if instead of it being somewhat of a momentary fad, this idea that like you could just microwave everything, you cook all your food that way. 02:59.026 --> 03:01.069 [JM]: Like once that idea was 03:01.049 --> 03:04.661 [JM]: essentially discarded by humanity, what would it look like? 
03:04.681 --> 03:07.752 [JM]: What would our world look like if we hadn't discarded that idea? 03:07.912 --> 03:08.494 [JM]: And so... 03:08.693 --> 03:14.481 [JM]: But going back to his first question of I want to know if there's any merit to all these forgotten microwaving techniques. 03:14.622 --> 03:20.871 [JM]: I love the sentence that he uses to support that, which is something that can make plasma out of grapes, 03:21.411 --> 03:27.500 [JM]: set your house on fire, and bring frozen hamsters back to life cannot be fundamentally bad. 03:28.662 --> 03:33.469 [JM]: And of course, the bring frozen hamsters back to life is linked, again, 03:33.449 --> 03:47.396 [JM]: to a scientific paper from the Physiological Society titled Reanimation of Rats from Body Temperatures Between Zero and One Degree Celsius by Microwave Diathermy. 03:47.416 --> 03:48.498 [JM]: I don't know how to pronounce that word. 03:49.240 --> 03:50.602 [JM]: Another new word that we've learned. 03:50.642 --> 03:54.630 [DJ]: Speaking of words I've never heard before. 03:54.610 --> 03:58.836 [DJ]: Today, I learned that I have a device in my home that can reanimate rodents. 03:59.477 --> 04:01.339 [DJ]: And this changes everything, Justin. 04:01.419 --> 04:02.861 [DJ]: This changes everything. 04:03.041 --> 04:12.694 [JM]: The idea that this thing in your microwave called a magnetron can bring frozen rats back to life is really quite the revelation in and of itself. 04:13.235 --> 04:15.859 [JM]: It's worth the price of admission right there, even if you don't read the rest of it. 04:15.919 --> 04:16.299 [DJ]: That's right. 04:16.339 --> 04:18.242 [DJ]: This article is already providing you with value. 04:18.702 --> 04:22.187 [DJ]: For the record, I want the listener to know I did read the whole article. 04:22.167 --> 04:24.671 [DJ]: Unlike my perhaps lazy... 04:25.432 --> 04:27.836 [JM]: Slacker co-host. 
04:27.856 --> 04:32.564 [DJ]: Although I do relate to the thing where you have something really good and you don't want to finish it. 04:32.764 --> 04:33.946 [DJ]: You want there to still be some. 04:34.307 --> 04:42.640 [DJ]: When I was a kid, notoriously, I mean, I guess notorious to my parents and no one else, but I would never finish my Halloween candy. 04:42.620 --> 04:52.493 [DJ]: In fact, if you graphed like consumption of Halloween candy over time, it would like start high on the day or two after Halloween and then rapidly drop off to nothing. 04:52.533 --> 05:01.164 [DJ]: To the extent that when the next Halloween would roll around, my parents would be like, OK, Dan, well, we have to throw out all your candy from last year because you're about to get a bunch more. 05:01.184 --> 05:08.273 [DJ]: And I think the reason was the same: that I felt like I would be so sad if I didn't have any more Halloween candy left 05:08.253 --> 05:16.523 [DJ]: that it prevented me from actually enjoying it and consuming it, as though somehow if I just never ate all of it, I could preserve the magic forever. 05:16.863 --> 05:26.795 [DJ]: Most of us eventually grow out of that mindset, and we realize that our time on Earth is transient and fleeting, and so if there's something to be enjoyed, you should just enjoy it and then move on. 05:27.116 --> 05:32.102 [DJ]: But Justin apparently is not like that, at least not when it comes to hilarious articles on the Internet. 05:32.362 --> 05:36.707 [JM]: A lesson that I have started learning but apparently haven't finished learning. 05:36.687 --> 05:40.356 [JM]: I will, as usual, put a link to this article in the show notes. 05:40.837 --> 05:45.990 [JM]: If, as a reminder, you aren't sure how to find our show notes, please reach out and ask. 05:46.371 --> 05:52.907 [JM]: Happy to point you in the direction of how to find it so that you can bask in the glow of this amazing post. 
05:52.887 --> 05:57.018 [DJ]: I'm glad you doubled down on letting people know about the show notes. 05:57.058 --> 06:09.630 [DJ]: But when you first started saying that, I couldn't help laughing at the notion that after all of the praise we've heaped and anticipation we've built for this article, if we didn't link to it from the show notes... 06:10.775 --> 06:11.837 [DJ]: That would just be cruel. 06:12.017 --> 06:12.478 [DJ]: Like, why? 06:13.019 --> 06:13.220 [JM]: Why? 06:13.300 --> 06:14.442 [JM]: How could we do such a thing? 06:14.883 --> 06:17.828 [JM]: And of course, dear listeners, we would never, never do that. 06:17.889 --> 06:18.870 [JM]: So it'll be there. 06:19.251 --> 06:19.552 [JM]: All right. 06:19.612 --> 06:23.439 [JM]: In other news, I wanted to talk a little bit about Mastodon for a moment. 06:24.080 --> 06:30.673 [JM]: And Mastodon, if you aren't aware, is an open source social network 06:30.653 --> 06:35.580 [JM]: and an alternative to Twitter, Facebook, and other centralized social networks. 06:36.141 --> 06:47.278 [JM]: And at the end of the show, when we talk about reaching out via the Fediverse, Mastodon is one of the more popular ways of accessing that federated social network that we refer to as the Fediverse. 06:47.258 --> 06:53.507 [JM]: And I wanted to mention this because they posted an article on their site called "Mastodon is for the people". 06:53.888 --> 07:13.857 [JM]: And it's a collection of updates about what's going on with the project, one of which is they're introducing the mastodon.social help center, which is a new resource for folks who are arriving on Mastodon and are trying to understand how to use the software and providing some resources such as guides and tutorials about the project. 07:13.837 --> 07:23.385 [JM]: One of the other things they mention is improving the question of server discovery or instance discovery, which has been a long-standing issue. 
07:23.425 --> 07:34.475 [JM]: And I think probably the thing that people complain about or critique the most when it comes to the Fediverse in general and Mastodon in particular. 07:34.515 --> 07:43.843 [JM]: And that's that since there are a bunch of different servers, like hundreds and hundreds of different server instances, it's not easy to know which one you should join. 07:43.823 --> 07:57.424 [JM]: And in the beginning, as people were fleeing centralized traditional social media a few years ago, Mastodon made a decision to send new signups directly to mastodon.social. 07:57.724 --> 08:05.376 [JM]: But the problem with doing that is that it creates the very centralization that they are trying to 08:05.356 --> 08:06.037 [JM]: get away from. 08:06.077 --> 08:15.609 [JM]: If you're driving everyone to join a single instance, then the federation is a nice theoretical concept, not being borne out by practice. 08:15.629 --> 08:28.326 [JM]: And it sounds like that's something that they recognize as a significant priority, and they are going to make some changes, beginning with recommending the closest geographic server in the target user's 08:28.306 --> 08:32.393 [JM]: configured language, instead of just defaulting to mastodon.social. 08:32.654 --> 08:41.129 [JM]: And then they're going to experiment with some other logic in terms of how to recommend a server when people go to sign up, including some degree of randomization. 08:41.770 --> 08:49.825 [JM]: And they are soliciting input from folks to try to understand how they can make this set of changes as good as they can make it. 08:49.805 --> 09:00.322 [JM]: On a recent episode, we talked about Discord's age verification and how I had canceled my account or at least submitted a request to cancel it because it takes 15 days. 
09:01.104 --> 09:15.728 [JM]: And it looks like Mastodon is doing something similar, that they are taking steps to demonstrate their commitment to their values by moving as much of their digital infrastructure as possible to free and open source software. 09:15.708 --> 09:18.532 [JM]: And this is a long-term thing that's going to take them a while. 09:18.913 --> 09:31.853 [JM]: But one of the steps they are taking is moving from Discord to Zulip, which is a Python-based open source alternative that I mentioned during the episode in which we talked about Discord. 09:32.274 --> 09:42.610 [JM]: Oh, and side note, just because it's timely, right before we started recording, I got a message from Discord saying, we are writing to let you know that this request of yours has been granted and your 09:42.590 --> 09:44.775 [JM]: account has been deleted or something to that effect. 09:44.835 --> 09:45.977 [JM]: So it's now official. 09:46.418 --> 09:53.534 [JM]: And the last thing I'll say about recent Mastodon news is that it seems that they have chosen a slogan, which I kind of love. 09:53.654 --> 09:56.701 [JM]: And the slogan is, "My friends are not for sale." 09:57.142 --> 09:59.687 [JM]: And when the Mastodon 09:59.667 --> 10:01.691 [JM]: founder posted about this, 10:02.052 --> 10:04.236 [JM]: he said, you voted for this slogan and now it's here. 10:04.676 --> 10:15.256 [JM]: We have a new batch of super nice t-shirts that say "my friends are not for sale" because your connections to other people are more than a bargaining chip for big tech companies to keep you from leaving them. 10:15.637 --> 10:19.785 [JM]: Which I think is a nice explanation of what their slogan means. 10:19.765 --> 10:26.119 [DJ]: It's a pretty good slogan, but I really want everyone to know that, on the other hand, my friends are for sale. 10:26.360 --> 10:33.957 [DJ]: And if any of you guys want to be friends with Justin, I will sell you that privilege for only $100. 
10:34.358 --> 10:37.926 [DJ]: So contact me via the Fediverse. 10:37.906 --> 10:39.310 [JM]: Do I get a cut of that? 10:40.192 --> 10:41.316 [DJ]: We can... absolutely not. 10:43.943 --> 10:45.226 [DJ]: We can set something up. 10:45.648 --> 10:47.413 [DJ]: So Justin, congratulations. 10:47.433 --> 10:48.837 [DJ]: You're about to have a bunch more friends. 10:49.258 --> 10:50.060 [JM]: Yay. 10:50.622 --> 10:52.928 [DJ]: And I'm about to be a couple of hundred bucks richer. 10:53.650 --> 10:54.631 [JM]: Well, this t-shirt is cool. 10:54.651 --> 10:59.438 [JM]: I think I'm going to pick up one of these and you can find a link to it in the show notes as usual. 10:59.498 --> 11:02.282 [JM]: All right, moving on to another story. 11:02.343 --> 11:11.476 [JM]: And that is that Simon Willison has posted yet another excellent post in which he breaks down the evolution of OpenAI's mission statement. 11:11.816 --> 11:22.071 [JM]: And this was a rather clever endeavor on his part because it seems like he went through the mandatory nonprofit tax returns that OpenAI 11:22.051 --> 11:40.090 [JM]: is required to file, and then extracted their mission statement for 2016 through 2024, and then used an LLM to turn it into a Git repository in order to view the diffs or the differences between these mission statements over time. 11:40.530 --> 11:42.432 [JM]: And it really is a fascinating look. 11:42.732 --> 11:50.961 [JM]: And I am using fascinating in the most charitable way I could possibly use that word, into how a company's 11:50.941 --> 11:53.484 [JM]: moral compass shifts over time. 11:54.004 --> 12:08.181 [JM]: Because in the beginning, there are all these lofty goals where they talk about advancing digital intelligence in a way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return. 
12:08.521 --> 12:15.649 [JM]: And talking about the importance of doing this safely and making sure the benefits are as widely and evenly distributed as possible. 12:15.629 --> 12:39.375 [JM]: And unsurprisingly, and as you can probably tell by the tone of how I'm explaining this, this just gets chipped away at over the years to the point where almost all of the high-mindedness has been sucked out of it into a sentence that's barely 10 words, which is "OpenAI's mission is to ensure that artificial general intelligence benefits all of humanity." 12:39.355 --> 12:54.332 [JM]: And I'm really struck, just as a side note, by this introduction of the word general, artificial general intelligence, that does not appear in any of the other iterations, apparently before 2024, at least as far as I can tell. 12:54.632 --> 12:57.976 [JM]: Because arguably what we have right now is not that, right? 12:57.996 --> 13:06.265 [JM]: Like we don't have software that achieves what I would call human-like intelligence. 13:06.285 --> 13:09.328 [JM]: There are elements of it that are somewhat like humans, but... 13:09.308 --> 13:12.420 [JM]: I don't know, it's hard for me not to read this cynically and be like, okay, got it. 13:12.460 --> 13:20.128 [JM]: So what you're saying is that the thing you're currently delivering, which is not AGI, you don't really care whether that benefits all of humanity or not. 13:20.851 --> 13:23.417 [DJ]: Yeah, yeah, that doesn't need to benefit humanity. 13:23.457 --> 13:28.307 [DJ]: This other hypothetical thing that may be impossible, we want to make sure that benefits humanity. 13:28.528 --> 13:39.973 [DJ]: On the way there, though... I, for one, am really excited to see the 2027 OpenAI mission statement, which is just going to be 10 dollar signs all in a row and nothing else. 13:40.038 --> 13:45.208 [JM]: Each dollar sign representing the number of trillions of dollars they will have invested in this endeavor. 13:45.388 --> 13:46.290 [DJ]: Yeah. 
13:46.410 --> 13:58.834 [DJ]: Each dollar sign representing like the 10% of all of the Earth's economic output that they have devoured in their pursuit of, I don't know, putting advertising in their chat app. 13:58.994 --> 14:01.759 [DJ]: What is it that they're doing now? 14:01.739 --> 14:02.139 [DJ]: Right. 14:02.540 --> 14:03.121 [DJ]: They are indeed. 14:03.141 --> 14:15.635 [DJ]: I can just see Sam Altman's little boy anime Twitter avatar crossing his arms in a huff at how unfair we're being to, I'm sure, the noble, high-minded mission of... 14:15.816 --> 14:17.878 [DJ]: I can't even bring myself to end this sentence. 14:19.680 --> 14:20.842 [DJ]: I'm so tired, Justin. 14:20.882 --> 14:25.367 [DJ]: I'm so tired of the depredations of these companies. 14:25.567 --> 14:43.154 [JM]: Understandably, I mean, their current mission statement mentions nothing about safety, nothing about "we're going to do this without regard for generating financial returns," because, of course, that's not something that they care about anymore. 14:43.214 --> 14:47.580 [JM]: If it was indeed something that they cared about in the beginning, who knows? 14:48.001 --> 14:53.970 [JM]: But it's so hard for this not to feel like the whole "don't be evil" mantra that Google started out with. 14:53.950 --> 15:02.144 [JM]: And then decided that was too constraining and that they had outgrown it and discarded that as their mantra. 15:02.605 --> 15:13.444 [JM]: And when you've decided that you've outgrown or feel too constrained by "don't be evil", I think once again, maybe it's time to check your moral compass. 15:13.424 --> 15:17.951 [JM]: I understand the idea that as humans, how does the saying go? 15:18.492 --> 15:22.239 [JM]: You either die a hero or live long enough to become the villain. 
15:22.779 --> 15:39.567 [JM]: It seems that also applies to companies, I guess, where they start out with some degree of idealism and then just totally lose that along the way or decide it's too constraining or decide they're not a pirate anymore and they just become the villain. 15:39.749 --> 15:53.184 [DJ]: We've talked before about how one of the sort of meta problems in our, I don't know, civilization is this idea that there's kind of only one acceptable narrative, especially for companies, and it's growth, period. 15:53.544 --> 16:03.175 [DJ]: And I think that leads to a lot of these second-order effects because, I mean, it's funny to imagine this idea of just like you want people to like look in the mirror 16:03.155 --> 16:06.099 [DJ]: and ask themselves, like, have I become the bad guy? 16:06.199 --> 16:15.753 [DJ]: And it's like, yeah, if you've decided that you have outgrown the concept of not being evil, then yes, definitively, you are now the villain in a James Bond film. 16:16.234 --> 16:17.335 [DJ]: Sorry, I don't make the rules. 16:17.576 --> 16:25.467 [DJ]: But I think the fact is most people don't wake up in the morning and decide that they don't care about whether they're evil or not anymore. 16:25.447 --> 16:42.204 [DJ]: Most people are slowly warped away from their ideals by the overpowering incentives that exist, and the rhetoric around, well, we're changing our mission statement to better align blah, blah, blah with blah, blah, blah is a post hoc rationalization. 16:42.264 --> 16:51.353 [DJ]: But one of the things I think that is so pernicious about this mission statement thing and the reason that we're having a lot of fun calling it out is this: 16:51.333 --> 17:04.960 [DJ]: OpenAI could have – I'm pretty sure they could have just not changed their mission statement ever and behaved exactly the same way because as far as I know, even though the – I think the IRS requires them to disclose this thing because they're a nonprofit corporation. 
17:05.000 --> 17:10.351 [DJ]: I don't think the IRS audits whether they are actually behaving in line with their mission statement, right? 17:10.411 --> 17:13.257 [DJ]: Like that's something that companies are expected to do for themselves. 17:13.237 --> 17:22.663 [DJ]: So if they kept their mission statement the same and just behaved less and less aligned with it, then at least an external or internal party could point it out. 17:23.064 --> 17:28.579 [DJ]: I'm under the impression, having worked for many companies with mission statements, that's kind of the point of the mission statement 17:28.559 --> 17:34.528 [DJ]: is to be able to occasionally check in and go, is our behavior actually aligned with what we say it should be? 17:34.889 --> 17:44.043 [DJ]: But the thing that feels really pernicious is when you see a company actually changing its mission statement away from ideals that you might think are good ones. 17:44.643 --> 17:52.255 [DJ]: Yeah, the "don't be evil" is a particularly extreme example because that's such a tautology. 17:52.235 --> 17:52.615 [DJ]: Right? 17:52.776 --> 17:56.160 [DJ]: It's like, I don't need to explain to you why you shouldn't be evil. 17:56.520 --> 17:58.923 [DJ]: Evil is a descriptor for things that are bad. 17:59.324 --> 18:01.086 [DJ]: Obviously, you shouldn't do those things. 18:01.386 --> 18:03.229 [DJ]: They are the set of things that should not be done. 18:03.589 --> 18:04.490 [DJ]: So don't be evil. 18:04.991 --> 18:05.251 [DJ]: Right? 18:05.271 --> 18:12.000 [DJ]: There's a lot there when a company decides, like, okay, our mission statement is no longer going to contain the language "don't be evil". 18:12.420 --> 18:17.907 [DJ]: It's pretty hard not to hear that and go, so you guys have decided it's not important not to be evil anymore? 18:17.927 --> 18:20.310 [DJ]: So, like, evil is on the table? 18:21.011 --> 18:21.111 [DJ]: Yeah. 18:21.091 --> 18:21.892 [DJ]: That's not great. 
18:22.413 --> 18:32.764 [DJ]: The problem with changing the mission statement is it really feels like an indication that some – that like deliberation has gone on, that people haven't just like lost their way and could be reminded of it. 18:32.844 --> 18:35.648 [DJ]: Like, hey, guys, tap the sign meme. 18:35.688 --> 18:39.472 [DJ]: This was our mission statement and we haven't been acting in accordance with it. 18:39.832 --> 18:47.181 [DJ]: But instead the idea that they're like deliberately going, all right, well, you know that thing we said about making sure that stuff benefits all of humanity? 18:47.201 --> 18:47.301 [DJ]: Yeah. 18:47.618 --> 19:02.717 [DJ]: maybe it should only be like artificial general intelligence that we make sure benefits all humanity because like, you know, then we won't feel so bad about ChatGPT telling people to harm themselves. 19:03.138 --> 19:10.047 [DJ]: Like, again, like that's probably like not really fair to any actual thing that's gone through the head of people who work at that company. 19:10.087 --> 19:11.549 [DJ]: But this is the outcome. 19:11.849 --> 19:13.291 [DJ]: It does feel like a canary. 19:13.311 --> 19:16.495 [DJ]: The metaphorical canary in the coal mine is when you see 19:16.475 --> 19:22.941 [DJ]: a company changing their mission statement and especially moving it away from these sorts of high-minded ideals. 19:23.561 --> 19:29.467 [DJ]: That does not bode well, especially when we can also compare it to the actual observed behavior. 19:29.687 --> 19:40.357 [JM]: To be more charitable than I think Google deserves, I suppose I could understand the notion that, OK, well, evil is subjective. 19:41.077 --> 19:44.220 [JM]: And if we say don't be evil, then – 19:44.200 --> 19:51.991 [JM]: anyone who disagrees with what we're doing can just say, oh, you say you're not evil, but here's this evil thing you're doing. 
19:52.512 --> 20:00.743 [JM]: And so I guess it's vague and amorphous and unspecific enough that it could be applied to all kinds of different behaviors. 20:01.264 --> 20:07.913 [JM]: And maybe this was like some flaw they identified in their original mission statement and decided that, you know what, this isn't really serving us well. 20:07.893 --> 20:11.336 [JM]: Again, I think this is more charitable than Google deserves. 20:11.657 --> 20:16.161 [JM]: But let's just say that I'm granting that for the sake of this discussion. 20:16.181 --> 20:25.750 [JM]: I don't think I could extend that same latitude to OpenAI in this case because it was actually reasonably specific. 20:26.210 --> 20:35.419 [JM]: And they decided that having safety as a goal was too constraining and that trying to make this technology accessible 20:35.399 --> 20:40.670 [JM]: as widely and evenly distributed as possible didn't fit their mission anymore. 20:41.411 --> 20:52.052 [JM]: And I don't really feel like there's much of any latitude I could extend to this series of decisions to do the opposite, right? 20:52.072 --> 20:55.399 [JM]: Like instead of taking a very vague initial 20:55.379 --> 21:03.110 [JM]: mission statement and making it specific, they took a specific mission statement and made it so vague as to not really mean anything anymore. 21:03.531 --> 21:08.098 [JM]: This idea that it is to ensure that it benefits all of humanity. 21:08.418 --> 21:12.544 [JM]: And again, it being AGI, but whatever, is so vague 21:12.524 --> 21:18.490 [JM]: that it's hard to call them out for doing something that's violating their mission statement because it doesn't mean anything anymore. 21:18.510 --> 21:26.658 [JM]: And side note, this idea that OpenAI is a nonprofit is not really true if I understand it correctly. 21:26.678 --> 21:32.984 [JM]: It has a nonprofit component and it has a massive for-profit component. 
21:33.285 --> 21:41.653 [JM]: There are two parts of this company and I think the nonprofit part of it is increasingly irrelevant. 21:41.633 --> 21:47.144 [DJ]: Maybe that explains why the nonprofit part of its mission statement is getting increasingly vague. 21:47.384 --> 21:51.212 [DJ]: You did make a good point, though, too, about those things going in opposite directions. 21:51.252 --> 21:52.114 [DJ]: And it is true. 21:52.134 --> 21:55.340 [DJ]: Like, Google is obnoxious for lots of reasons. 21:55.520 --> 22:01.813 [DJ]: And so, like, we're just going to put them on blast about their stupid mission statement thing, and they're just going to have to live with that. 22:01.833 --> 22:03.456 [DJ]: But it is a fair point 22:03.436 --> 22:11.245 [DJ]: to actually look at it and go, okay, well, "don't be evil" is too vague and too obvious to actually mean anything in practice. 22:11.345 --> 22:26.903 [DJ]: Like I could indeed see a company starting there and then going, okay, but seriously, we actually need a mission statement that we can use to say something about our actual values because it is actually a little bit of a cop-out being like, oh, well, my values, don't be evil. 22:27.103 --> 22:30.507 [DJ]: It's like, right, the bar's on the floor with that one. 22:30.487 --> 22:32.350 [DJ]: Obviously, no one should be evil. 22:32.590 --> 22:34.192 [DJ]: So what else are you about? 22:34.793 --> 22:39.619 [DJ]: That is actually, I think, a fair reason for them to rethink that. 22:39.760 --> 22:49.693 [DJ]: Whereas, yeah, it seems like OpenAI, at least with this part of it, is going in the opposite direction where they had something more specific and it's just getting more and more watered down. 
22:49.673 --> 23:04.066 [DJ]: Well, and more and more, like more and more turning into the form of that sort of like corporate doublespeak that we all are annoyed by and tend to make fun of where you just kind of say a lot of words that don't really map onto anything concrete. 23:04.486 --> 23:11.933 [DJ]: And as much as I think benefiting all of humanity is, again, like a noble ideal, how are you going to measure that, right? 23:11.953 --> 23:17.638 [DJ]: Like how are you going to sit down every year and figure out if what you did benefited all of humanity or did not? 23:17.618 --> 23:25.667 [DJ]: Frankly, it kind of seems like the sort of thing that is so perhaps unachievable that then it lets you off the hook for achieving anything. 23:25.868 --> 23:28.671 [DJ]: Because if you just go like, well, we're trying to benefit all of humanity. 23:28.691 --> 23:29.412 [DJ]: It's like, well, did you? 23:29.472 --> 23:30.893 [DJ]: It's like, well, no, of course not. 23:30.954 --> 23:32.215 [DJ]: But that's impossible. 23:32.375 --> 23:35.979 [DJ]: So therefore, like, you don't have to hold us accountable for anything we do. 23:36.380 --> 23:36.600 [DJ]: Right. 23:36.740 --> 23:40.825 [DJ]: I think that's like that's a fair argument against using language like that. 23:40.805 --> 24:03.695 [JM]: I can't help but just feel like I want them just to be actually honest about it, which is we want to create the world's largest software company that will gradually replace all of the other world's software and create an entity that has a market capitalization larger than all of the other companies in the world combined. 24:04.175 --> 24:07.800 [DJ]: I was going to say, we want to find out what comes after trillion. 24:08.501 --> 24:08.601 Yeah. 24:09.003 --> 24:13.050 [JM]: I can't wait till Sam Altman starts throwing out the word quadrillion. 24:13.550 --> 24:14.652 [DJ]: Yeah, really? 24:14.672 --> 24:15.974 [DJ]: All right. 
24:16.095 --> 24:21.163 [DJ]: But OpenAI is not the only large language model company on the block. 24:21.183 --> 24:26.572 [DJ]: There are others, and fortunately, they are not nearly as bad. 24:26.612 --> 24:27.714 [DJ]: Are they, Justin? 24:28.195 --> 24:29.817 [DJ]: What about Anthropic, for example? 24:29.877 --> 24:31.500 [DJ]: Anthropic's good, right? 24:31.480 --> 24:34.923 [JM]: I don't know if I would use the word good, but better maybe? 24:35.124 --> 24:35.744 [JM]: I don't know. 24:35.784 --> 24:53.562 [JM]: They have recently raised $30 billion in a Series G funding round at a post-money valuation of $380 billion, which is significantly less than OpenAI's. 24:53.582 --> 25:01.150 [JM]: I don't even know what their current valuation is estimated at, but probably in the trillions? 25:01.636 --> 25:26.366 [JM]: But yeah, increasingly, it seems like, as we have talked about before, there is this small number of companies pouring unholy, well, raising unholy amounts of money and then pouring those unholy sums into buying up all the RAM and hard drives and solid state drives and all the other things that we can't afford or buy anymore, as we have discussed. 25:26.767 --> 25:26.867 Yeah. 25:26.847 --> 25:35.478 [JM]: But another thing that all of this activity is doing is it is having an impact on the open source world. 25:35.939 --> 25:49.497 [JM]: Last time we talked about some rogue OpenClaw account that submitted some pull request to an open source repository that was politely declined, and then that OpenClaw bot 25:49.477 --> 25:55.427 [JM]: wrote a blog post, some screed, decrying this supposed act of gatekeeping. 25:55.467 --> 26:03.261 [JM]: And we are starting to see other examples of how generative software is having an effect on open source. 
26:03.621 --> 26:06.827 [JM]: An example of this that has made the news recently is 26:06.807 --> 26:25.677 [JM]: that some generative software agent created a GitHub account a couple of weeks ago and in the span of two weeks opened 103 pull requests across 95 repositories and got some of that code merged into notable software projects. 26:25.657 --> 26:40.697 [JM]: And then subsequently started reaching out to other open source maintainers, offering to contribute, and then using the pull requests that got merged as proof or currency that their contributions would be worthwhile. 26:41.037 --> 26:55.596 [JM]: And at no point does this bot talk about the fact that it's a bot, or that it is doing all of this in order to try to get people to notice its commercial website and to try to make money off of this whole flurry of automated transactions, 26:55.576 --> 26:57.879 [JM]: code generation and submission. 26:58.419 --> 27:11.275 [JM]: And it's crazy to me that this code got merged into projects that are widely used and sometimes are important parts of our modern software ecosystem. 27:11.415 --> 27:15.981 [JM]: This is infrastructure that a lot of our other software is built on top of. 27:15.961 --> 27:32.724 [JM]: And it's reminiscent of something that happened a while ago with a project called XZ, which is a compression library where someone played this really long game over the span of years where small, useful contributions were made. 27:33.124 --> 27:35.067 [JM]: Credibility was built up over time. 27:35.047 --> 27:49.445 [JM]: And then once this person had built up that credibility, they used it to submit a secret and hidden backdoor into the project that was merged and was really only discovered by mistake. 27:49.505 --> 27:58.776 [JM]: Someone realized that there was a performance regression, that this compression library was a little bit slower than it was before, to the tune of milliseconds. 
27:58.756 --> 28:05.267 [JM]: And when investigating why it had gotten slower, it was only then by chance that this backdoor was discovered. 28:05.648 --> 28:07.992 [JM]: That took years and a human did it. 28:08.333 --> 28:13.461 [JM]: This took two weeks and it was apparently mostly, if not fully, automated. 28:13.802 --> 28:18.270 [JM]: And I've seen this latter pattern described as reputation farming at scale. 28:18.630 --> 28:21.355 [JM]: And that seems like a pretty good description. 28:21.335 --> 28:26.485 [JM]: Because it is so much easier to manufacture trust in this way. 28:26.545 --> 28:34.059 [JM]: And it seems like, as we've talked about before, that the OpenClaw project is being, I would say, abused. 28:34.479 --> 28:41.773 [JM]: Because I don't think this type of behavior is the original intention of its creator or any of the people that originally contributed to it. 28:41.753 --> 28:48.724 [JM]: But if we have learned anything about the internet, it is that people will find ways of abusing the tools that we make available to them. 28:49.125 --> 28:57.117 [JM]: Side note, I remember mentioning before OpenClaw's rapid popularity and how it had reached this crazy number of GitHub stars. 28:57.458 --> 29:01.063 [JM]: Because I couldn't resist, I looked it up right before we started recording. 29:01.584 --> 29:06.732 [JM]: I don't even think they were on the top 100 list when I originally 29:06.712 --> 29:09.738 [JM]: wanted to see where they were on this ranking. 29:10.159 --> 29:25.969 [JM]: With a star count now of 224,000, they are number 14 on the list of top star count repositories, which, by the way, is above Python and Linux, 29:25.949 --> 29:27.571 [JM]: which to me is just bananas. 29:27.971 --> 29:31.454 [JM]: Andrew Nesbitt has written an article called Respectful Open Source. 
29:31.975 --> 29:46.309 [JM]: And the genesis for the post is that this person found a bug in a popular open source project and had the full intention of submitting it to the repository so that the maintainer could take a look at it and merge it if appropriate. 29:46.809 --> 29:54.977 [JM]: And when this person went to the repository and saw that the maintainer was just drowning in a sea of issues and pull requests, 29:54.957 --> 30:03.853 [JM]: they decided not to submit the fix for this problem because they didn't want to contribute to what appeared to be an overwhelming situation for the maintainer. 30:03.913 --> 30:18.940 [JM]: And I think one of the more interesting things about this article is this discussion of how GitHub both enabled all of this collaboration in open source to occur and then simultaneously 30:18.920 --> 30:22.666 [JM]: made it easy for open source maintainers to become overwhelmed. 30:23.007 --> 30:35.848 [JM]: And that a better way to have architected, or at least promoted, how GitHub handles the collaboration would be to improve fork discovery so that you can easily see 30:35.828 --> 30:38.211 [JM]: what are some forks that people have created? 30:38.271 --> 30:42.095 [JM]: And what are the things that they've done that are valuable in those forks? 30:42.495 --> 30:49.143 [JM]: Because at that point, it could be something that the maintainer kind of pulls into their awareness and their attention. 30:49.183 --> 30:51.685 [JM]: And they can work on that as they see fit. 30:51.705 --> 30:57.832 [JM]: But instead, we have this model where everyone pushes their contributions at the maintainers. 30:58.313 --> 31:01.296 [JM]: And this is highlighted as a big source of this problem. 31:01.276 --> 31:05.224 [JM]: And so in the end, this person says, OK, well, the fix is on my fork. 
31:05.565 --> 31:16.347 [JM]: And if the maintainer happens to discover it, even though, again, GitHub has made fork contribution discovery harder than perhaps it could be, then they can find it and use it. 31:16.447 --> 31:20.095 [JM]: But presumably they won't because of the aforementioned discoverability problem. 31:20.075 --> 31:44.800 [JM]: And going back to the topic at hand, this problem is only going to accelerate wildly because of all of these agents that are being unleashed and just automating all of this stuff, to the point where maintainers are considering just turning off the whole pull request ability, something that GitHub, I think somewhat reluctantly, 31:44.780 --> 31:55.999 [JM]: is now talking about enabling or maybe has already enabled as an option for maintainers, just to give them essentially a ripcord to say like, I can't manage this anymore. 31:56.059 --> 32:03.852 [JM]: I am hereby cutting off all pull request submissions as a way of contributing to this repository. 32:03.832 --> 32:11.812 [JM]: And I think we talked before about this idea of captchas being this original idea of here's how we can prove that we're human. 32:11.852 --> 32:15.862 [JM]: And of course, we live in an era where those are effectively useless. 32:16.463 --> 32:21.115 [JM]: And there really isn't a good way to prove that you're a human. 32:21.095 --> 32:28.646 [JM]: There's no way for an open source maintainer to say, OK, prove you're a person, prove you're a human, prove you are not a bot. 32:29.026 --> 32:37.599 [JM]: And so several projects have surfaced that try to address this problem, one of which is called Vouch, by Mitchell Hashimoto. 32:38.079 --> 32:48.334 [JM]: And the idea is that this essentially creates a web of trust for an open source project where trusted people vouch for other trusted people. 32:48.314 --> 32:55.906 [JM]: And people who have not been vouched for cannot contribute to the project when this is implemented. 
32:56.086 --> 33:06.643 [JM]: People who cause problems for a given repository can be denounced and effectively banned via this same system. 33:06.623 --> 33:14.333 [JM]: And one of my original concerns with this project is that I got the impression that it was tied to GitHub. 33:14.373 --> 33:23.664 [JM]: But from what I can understand, it is explicitly not dependent on GitHub and can work with other code forges. 33:24.105 --> 33:27.729 [JM]: And that seems like an important thing for a project like this to have. 33:27.749 --> 33:30.673 [JM]: It shouldn't, in my opinion, be tied to a particular 33:30.653 --> 33:31.374 [JM]: code forge. 33:31.895 --> 33:40.548 [JM]: And Mitchell created this project in part to actively deploy it and use it for the terminal emulator project that he's been working on. 33:41.069 --> 33:43.393 [JM]: I came across another one called Good Egg. 33:43.814 --> 33:49.583 [JM]: And this one does require GitHub; it is very much centered on GitHub. 33:49.943 --> 33:56.273 [JM]: And one of the key differences with Good Egg is that it is meant to be an automatic check. 33:56.253 --> 34:02.941 [JM]: It is not a human web of trust where one group of humans is vouching for others. 34:03.441 --> 34:19.940 [JM]: It is meant to look at a person's GitHub account and their behavior and determine in the context of a given repository, say yours as a maintainer, whether or not you should have high confidence that they will be, as they call it, a good egg. 34:20.341 --> 34:25.787 [JM]: And for fun, I ran this tool against my own GitHub account and using... 34:25.767 --> 34:31.454 [JM]: an open source project that I maintain called Pelican as the context for this analysis. 34:31.915 --> 34:37.902 [JM]: And it came back with a Good Egg score of high, with a 96% rating. 34:38.343 --> 34:44.030 [JM]: And I guess this is good in that it scales better than trying to get humans to vouch for other humans. 
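The vouch-and-denounce scheme described above can be sketched in a few lines of code. This is a hypothetical illustration of the general web-of-trust idea, not Vouch's actual data model, file format, or API; the names (`TrustRegistry`, `mitchellh` as a founder, `alice`) are made up for the example.

```python
# Hypothetical sketch of a web-of-trust allowlist for an open source
# project: founders are trusted axiomatically, trusted people can vouch
# for newcomers, and trusted people can denounce bad actors.
class TrustRegistry:
    """Tracks who has been vouched for, by whom, and who is denounced."""

    def __init__(self, founders):
        # Maps each trusted user to the person who vouched for them.
        # Founders have no voucher, so they map to None.
        self.vouches = {f: None for f in founders}
        self.denounced = set()

    def _is_trusted(self, user):
        return user in self.vouches and user not in self.denounced

    def vouch(self, voucher, newcomer):
        # Only currently trusted people may vouch; denounced users
        # cannot be rehabilitated by a simple vouch in this sketch.
        if not self._is_trusted(voucher):
            raise PermissionError(f"{voucher} is not trusted to vouch")
        if newcomer not in self.denounced:
            self.vouches[newcomer] = voucher

    def denounce(self, denouncer, user):
        if not self._is_trusted(denouncer):
            raise PermissionError(f"{denouncer} is not trusted to denounce")
        self.denounced.add(user)
        self.vouches.pop(user, None)

    def may_contribute(self, user):
        return self._is_trusted(user)


registry = TrustRegistry(founders=["mitchellh"])
registry.vouch("mitchellh", "alice")
print(registry.may_contribute("alice"))     # True
print(registry.may_contribute("randobot"))  # False: never vouched for
registry.denounce("mitchellh", "alice")
print(registry.may_contribute("alice"))     # False: denounced
```

One design question a real system has to answer, which this sketch dodges, is transitive revocation: if a voucher turns out to be a bad actor, what happens to everyone they vouched for?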
34:44.450 --> 34:49.817 [JM]: But I do wonder whether this kind of thing can be gamed at scale. 34:50.118 --> 34:52.845 [JM]: And I suppose time will tell whether it can be gamed. 34:53.105 --> 35:07.300 [JM]: Given what we've seen that these software bots can do, I do wonder whether this can be bypassed, essentially, in terms of getting a high score that isn't actually indicative of someone that's a human. 35:07.280 --> 35:12.192 [JM]: I remember back in the day, there was such a thing as key signing parties. 35:12.473 --> 35:25.103 [JM]: Back when PGP, which stands for Pretty Good Privacy, and its free implementation GPG were the thing, people would get together and meet each other in person and sign each other's encryption keys. 35:25.083 --> 35:27.328 [DJ]: This never really happened, Justin. 35:27.990 --> 35:29.754 [JM]: It absolutely happened. 35:30.236 --> 35:36.912 [DJ]: I've seen stuff on the internet. By the way, if PGP is pretty good privacy, I assume GPG is good privacy good. 35:39.318 --> 35:41.944 [JM]: Unsurprisingly, no, that is not what it stands for. 35:41.924 --> 35:50.455 [DJ]: Whenever I've seen people talk about it, it's like people can have these key signing parties and you'll get together in person and trade your PGP keys to show that you trust each other. 35:50.475 --> 35:52.698 [DJ]: I remember reading that and thinking like, oh, okay. 35:52.718 --> 35:58.746 [DJ]: So this happened on the Stanford campus once in like 1989 and it's never happened again. 35:58.786 --> 36:03.192 [DJ]: But honestly, how many key signing parties do you think there have ever been? 36:03.252 --> 36:04.113 [DJ]: Period. 36:04.394 --> 36:04.594 [DJ]: Ever. 36:04.874 --> 36:06.517 [DJ]: In the entire history of humanity. 36:06.697 --> 36:07.117 [DJ]: Anywhere. 36:07.418 --> 36:08.379 [DJ]: It's got to be under 10. 36:09.080 --> 36:09.180 Yeah. 36:09.160 --> 36:11.222 [JM]: My guess is that it's over 10. 36:11.743 --> 36:12.864 [JM]: But I get your point. 
36:13.424 --> 36:20.231 [JM]: And yes, this was in terms of the number of humans relative to the population of the planet. 36:20.291 --> 36:26.437 [JM]: This was a tiny, tiny fraction of humans, relatively speaking, that did this activity. 36:26.517 --> 36:28.299 [JM]: Yes, I get your point. 36:28.379 --> 36:29.400 [JM]: And I already knew that. 36:29.820 --> 36:34.105 [JM]: But I do wonder whether this is where some of this is headed, right? 36:34.185 --> 36:34.425 [JM]: Like, 36:34.405 --> 36:49.684 [JM]: One could imagine at software conferences, people getting together and essentially having some similar web of trust system where it's like, okay, well, I'm talking to you in a room and I'm in physical space with you. 36:49.744 --> 36:51.148 [JM]: So I know you are a human. 36:51.128 --> 36:52.871 [JM]: I wonder if it's going to come down to that. 36:53.272 --> 37:02.710 [DJ]: Part of the problem there is it doesn't matter because like people have been meeting each other in rooms and then screwing over the people they met in those rooms for as long as human beings have existed. 37:03.071 --> 37:09.002 [DJ]: So like merely meeting a person does not prove anything about their intentions. 37:08.982 --> 37:19.916 [DJ]: One of the things that sticks out to me about the, to some degree, the whole, how can we prove that the people contributing to this open source project are humans and not somebody's OpenClaw instance? 37:19.936 --> 37:24.662 [DJ]: As you demonstrated through example, like humans can do bad stuff to your repository too. 37:25.043 --> 37:28.127 [DJ]: Like the fact that a PR came from a human doesn't mean it's good. 37:28.327 --> 37:31.471 [DJ]: And the fact that a PR came from a robot doesn't mean it's bad. 37:31.852 --> 37:31.952 Yeah. 37:31.932 --> 37:35.340 [DJ]: I guess the real problem is like the massive volume of contributions. 37:35.740 --> 37:38.186 [DJ]: And the more contributions you get, the harder it is to review them. 
37:38.687 --> 37:49.030 [DJ]: And the harder it is to review code effectively, the more likely it is that, whether human or robot and whether through malfeasance or incompetence, some bad code gets into your system. 37:49.010 --> 37:53.843 [DJ]: I guess I'm just not that bullish that these tools are going to help that problem that much. 37:54.284 --> 37:59.017 [DJ]: Or maybe to put it another way, like, the best hope is better tooling to detect bad code. 37:59.478 --> 38:03.489 [DJ]: The backdoor in XZ, I'm Canadian, I should call it X-Zed. 38:03.870 --> 38:04.572 [DJ]: So... 38:04.552 --> 38:16.841 [DJ]: It is rather chilling to hear about a person who apparently deliberately spent a long time making everyone think they were good so that they could then deliberately do something bad without anyone noticing. 38:17.382 --> 38:21.632 [DJ]: And you were talking about how the backdoor they introduced was discovered essentially by accident. 38:21.612 --> 38:34.670 [DJ]: I guess what occurs to me is like maybe the best bet is to try to get better at detecting backdoors because in my, I don't know, perhaps jaded worldview, there is essentially no way to effectively prevent that type of behavior. 38:34.950 --> 38:37.153 [JM]: I don't think it's about preventing it. 38:37.614 --> 38:47.788 [JM]: And I don't know that trying to detect bad code is something that really scales very well or is even feasible. 38:47.853 --> 38:55.775 [DJ]: Yeah, but I guess my point is that detecting whether you should accept this PR in the first place or not doesn't scale well, nor is it feasible either. 38:55.815 --> 38:56.437 [JM]: For sure. 38:56.577 --> 39:00.548 [JM]: I think the point of the detection, like, hey, are you a human or not? 39:00.749 --> 39:02.554 [JM]: The point of that is to... 39:02.720 --> 39:06.347 [JM]: essentially put like a valve on the fire hose. 39:06.588 --> 39:09.594 [JM]: The idea is to stop the flooding, right? 
39:10.014 --> 39:18.872 [JM]: By stopping the flooding, then you're able to say like, okay, well, because look, the alternative, as we're already talking about, is just saying like, okay, that's it. 39:18.852 --> 39:20.915 [JM]: No more submissions to this project. 39:21.155 --> 39:29.187 [JM]: You are either one of the core developers and you get to contribute to this project, or you aren't and you don't. 39:29.307 --> 39:31.170 [JM]: And there are projects that are run this way, by the way. 39:31.550 --> 39:40.102 [JM]: There are open source projects that receive, that accept, I should say, zero outside contributions as a matter of policy. 39:40.383 --> 39:41.925 [JM]: They simply do not accept them. 39:41.945 --> 39:44.489 [JM]: And that's all fine and good if that's your intention. 39:44.949 --> 39:47.593 [JM]: But if your intention is, no, actually, 39:47.573 --> 39:50.798 [JM]: we want to welcome new contributors to this project. 39:51.159 --> 39:56.628 [JM]: We just can't because of this flood of automated spam, essentially. 39:56.648 --> 40:04.500 [JM]: Again, even if the contributions themselves are useful, it doesn't matter if there are too many to review. 40:04.540 --> 40:12.433 [JM]: If you cannot separate the good contributions from the bad ones because you are just drowning under a flood of 40:12.413 --> 40:13.395 [JM]: contributions, 40:13.715 --> 40:15.758 [JM]: it doesn't matter that there are good ones in there. 40:16.199 --> 40:25.614 [JM]: So I think the intention of saying like, we need to find out whether contributions are coming from a person, is just to manage the flow. 40:25.714 --> 40:27.156 [JM]: It's a way of saying like, okay. 40:27.197 --> 40:28.699 [JM]: And it's also another signal, right? 40:28.919 --> 40:31.824 [JM]: It gives you some information, as we've talked about before. 
40:32.225 --> 40:35.670 [JM]: It just gives you some information that you can use. 40:35.650 --> 40:41.082 [JM]: In addition to just reducing the flow, it gives you some information you didn't have before, right? 40:41.503 --> 40:45.231 [JM]: Whether you choose to act on that or not is up to you as the maintainer. 40:45.612 --> 40:52.748 [JM]: But without knowing it, you're just less well-equipped, I think, to decide how you want to act on something that lands in your inbox. 40:52.728 --> 40:54.090 [DJ]: Yeah, that makes a lot of sense. 40:54.471 --> 40:58.758 [DJ]: What you're saying, ironically, makes me think that maybe key signing parties are the future. 40:59.118 --> 41:04.808 [DJ]: Because something that's occurring to me as we discuss this is nothing scales, really. 41:04.828 --> 41:06.791 [DJ]: Not the way that we often think it does. 41:07.091 --> 41:09.736 [DJ]: Humans aren't made to scale, to a certain extent. 41:09.756 --> 41:12.120 [DJ]: And so I think you kind of mentioned this before. 41:12.761 --> 41:18.550 [DJ]: One of the tricky things about GitHub, even before agentic code generator bots existed, 41:18.530 --> 41:25.804 [DJ]: was the fact that you could start a project and anyone in the entire world could contribute to it. 41:25.964 --> 41:31.815 [DJ]: That's already fundamentally not scalable in terms of things like trust and in terms of things like workload. 41:32.176 --> 41:36.825 [DJ]: So I wonder, now that in certain domains, and open source contributions is one of them, 41:36.805 --> 41:53.278 [DJ]: generative so-called AI is able to produce at such a higher scale than even all the humans on the internet could, if what we're going to see is a contraction back into like much smaller, tighter communities, essentially. 
41:53.258 --> 42:13.313 [DJ]: Where maybe it becomes the new norm that when someone maintains an open source project, they only ever accept contributions from a small number of people, the number of people that they could meet, whether in person or not, but that they could actually develop enough of a trustworthy relationship with that they would go, okay, you can contribute to my project. 42:13.333 --> 42:15.216 [DJ]: That might just have to become the new norm. 42:15.432 --> 42:30.454 [JM]: I don't see any way for it not to become the new norm, because as I mentioned in a previous episode, I feel like this is just this tiny, tiny grain of sand on the iceberg. 42:30.434 --> 42:36.242 [JM]: Sorry, a tiny, tiny grain of sand on the tip of this massive iceberg. 42:36.262 --> 42:39.947 [JM]: We're just seeing this little tiny window into our future. 42:40.468 --> 42:55.328 [JM]: So I just don't see any other outcome other than people just restricting a lot of different things going forward in order to combat this wave of automated everything. 42:55.308 --> 42:59.492 [DJ]: Yeah, wave of automated everything does seem like a good way to sum it up. 42:59.512 --> 43:04.256 [DJ]: I kind of feel good about the notion of people in a bunch of different domains, not just software development, 43:04.657 --> 43:23.374 [DJ]: maybe going back to smaller, more deliberately created forms of community. Not that those don't have their own problems, but one of the challenges of the internet age is that human beings really were not meant to collaborate, interact, et cetera, with each other at massive scale. 43:23.354 --> 43:30.364 [DJ]: So I don't think the future has ever been this notion of like each one of us is connected to every other one of us. 43:30.725 --> 43:39.036 [DJ]: I think finding the small communities that you fit in, whether that's about software development or some other thing, is the way forward. 
43:39.157 --> 43:49.952 [DJ]: And the more of these weird quasi-human-like behaviors that we see being performed by software in the form of generative software, etc., 43:49.932 --> 43:58.108 [DJ]: the more it makes me think, like, hmm, I just kind of want to find the 10 people in the world that I want to interact with, and I'm just going to cling to them. 43:58.609 --> 44:01.996 [DJ]: But I also don't really think there's anything wrong with that. 44:02.377 --> 44:06.625 [JM]: I think you might be right that the future does perhaps... 44:06.605 --> 44:14.714 [JM]: involve an increasing number of people who prioritize in-person interactions with other human beings. 44:15.295 --> 44:29.811 [JM]: Even though we seemingly are addicted to screens more and more, perhaps somehow we buck that trend and we find ways of prioritizing spending more time in person with the people we care about. 44:29.791 --> 44:31.314 [JM]: All right, that's all for this episode. 44:31.334 --> 44:32.556 [JM]: Thanks everyone for listening. 44:32.657 --> 44:38.388 [JM]: You can find me on the web at justinmayer.com and you can find Dan on the web at danj.ca. 44:38.788 --> 44:43.277 [JM]: Reach out with your thoughts about this episode via the Fediverse at justin.ramble.space.