WEBVTT

00:00:00.000 --> 00:00:04.000
Going down. Do we have… do we have anybody set up as co-hosts?

00:00:04.000 --> 00:00:19.000
Book in, please, Mr. Gary. Good evening, everybody. This is the St. Louis Linux Unix user group. It's our general monthly meeting, and tonight is Wednesday, April the 8th, 2026.

00:00:19.000 --> 00:00:37.000
Our presentation tonight is going to be from Scott Granneman, former vice president of this group, and Jans Carton. Both of them are the principals at WebSanity, and their topic tonight is going to be…

00:00:37.000 --> 00:00:45.000
Artificial intelligence, basically the use of it to generate code.

00:00:45.000 --> 00:00:50.000
So, looking forward to that. Before we do that, we're going to have a few…

00:00:50.000 --> 00:01:01.000
introductory notes, and then also a very brief period of time for some Q&A. And…

00:01:01.000 --> 00:01:11.000
Just to see if anybody's got any questions they really need answered early in the evening. There'll also be time after the presentation tonight to delve into this general Q&A as well.

00:01:11.000 --> 00:01:24.000
So with that, let's see. Stan, do we have… the overhead, the slides?

00:01:24.000 --> 00:01:25.000
Well, I'm looking at them. I don't know what you're looking at.

00:01:25.000 --> 00:01:31.000
Stan. Not the slides. Yeah, we don't see any slides.

00:01:31.000 --> 00:01:34.000
Oh, a bitch.

00:01:34.000 --> 00:01:50.000
Well, Stan is working on that. Next week on Thursday, we have Robert Citek, one of our long-term members. He's going to do a presentation for us on DuckDB.

00:01:50.000 --> 00:02:02.000
And he did a presentation on DuckDB back in November. This one will be kind of a different view of it. Basically, it's specifically using DuckDB

00:02:02.000 --> 00:02:16.000
to organize the U.S. Census data, which, of course, if you've ever looked at it, there's a lot of data there. And he also may get into using it for…

00:02:16.000 --> 00:02:21.000
geological data as well.

00:02:21.000 --> 00:02:24.000
Close. Geospatial data, but close enough.

00:02:24.000 --> 00:02:26.000
Thank you, geospatial data. Thank you. That was the voice of Robert.

00:02:26.000 --> 00:02:29.000
God damn it.

00:02:29.000 --> 00:02:31.000
That was not… that was Stan.

00:02:31.000 --> 00:02:36.000
Oh.

00:02:36.000 --> 00:02:38.000
I give up. I totally give up. I mean, I just.

00:02:38.000 --> 00:02:52.000
Okay, we'll cut it. Kind of doing it from memory, then. By the way, that meeting is next week on Thursday. It starts at 6:30. And…

00:02:52.000 --> 00:02:57.000
Probably goes till maybe 9, but probably less. So.

00:02:57.000 --> 00:03:05.000
Let's see other meetings. Stan, do you remember what day the new user group is?

00:03:05.000 --> 00:03:09.000
I don't know what I had for breakfast, and you asked me a question like that?

00:03:09.000 --> 00:03:16.000
I know. Well, you know, the nice thing is, I can just call up the calendar.

00:03:16.000 --> 00:03:19.000
The calendar has all of that on it. For those who have it.

00:03:19.000 --> 00:03:24.000
Oh, wow, Gary is actually using the computer. How about that? Oh my.

00:03:24.000 --> 00:03:29.000
For those of you who haven't used it, if you go to sluug.org

00:03:29.000 --> 00:03:48.000
slash calendar, it will bring up both our own events, the ones that we sponsor and do. There's also a subdivision of it which does St. Louis local events,

00:03:48.000 --> 00:03:59.000
And another subdivision that does remote and online events. So you can see all of that together at one time, or you can divide it up into the different ones.

00:03:59.000 --> 00:04:12.000
So with that, I'm going to peek and see when our… local meeting is, so…

00:04:12.000 --> 00:04:17.000
The main Linux meeting should be, uh, 8 days from now.

00:04:17.000 --> 00:04:18.000
Shit.

00:04:18.000 --> 00:04:23.000
There we go.

00:04:23.000 --> 00:04:31.000
Thank you very much. Appreciate that clue.

00:04:31.000 --> 00:04:37.000
Let's see.

00:04:37.000 --> 00:04:43.000
And so it would be the new user one 8 days from now. No.

00:04:43.000 --> 00:04:51.000
Eight days from now is the Linux group.

00:04:51.000 --> 00:05:02.000
Anyway, it's simple: if you use the calendar, you can see everything at once. Yeah, yeah.

00:05:02.000 --> 00:05:12.000
Okay.

00:05:12.000 --> 00:05:16.000
World Penguin Day is the 25th of the month.

00:05:16.000 --> 00:05:22.000
The new Linux user group is on Tuesday the 28th.

00:05:22.000 --> 00:05:46.000
The new Linux group. It… it's not exclusively. You don't have to be a new user. In fact, it's kind of fun for a familiar user to be there. But the rule is, whoever considers themselves to be the newest user of Linux gets to ask the first question and kind of push that to…

00:05:46.000 --> 00:05:51.000
to an answer. And then the next most new user.

00:05:51.000 --> 00:06:03.000
But I always encourage the graybeards, so to speak, in the group. Yeah, it's actually very informative to go there, because you find out what new users are looking for.

00:06:03.000 --> 00:06:18.000
And the amazing thing is, you know, somebody will say… Stan will say one thing, and Phil will say, you know, yeah, but I do it this way, and I'll be there taking notes and going, you know, wow, I didn't know you could do it either of those two ways.

00:06:18.000 --> 00:06:25.000
Um, so yeah, you, you… It's amazing sometimes what you find out from other experienced users.

00:06:25.000 --> 00:06:41.000
So I encourage you all to go. On the 7th of May, it's a Thursday, and that is the St. Louis Area Computer Club. That is primarily a hardware group. So yes,

00:06:41.000 --> 00:06:54.000
they even discuss proprietary operating systems like Microsoft's. So yeah, you can ask the whole range of questions there.

00:06:54.000 --> 00:07:05.000
Let's see. We did have our elections last month. They were delayed because of some problems with the mail, and the…

00:07:05.000 --> 00:07:12.000
the result was basically that the incumbents were reelected. Lee Lammert was elected to another three-year term on the board.

00:07:12.000 --> 00:07:17.000
And Grant was elected to another three-year term on the board.

00:07:17.000 --> 00:07:28.000
So, and they'll get around to appointing officers. Speaking of officers, or office holders, if anybody would care to

00:07:28.000 --> 00:07:43.000
volunteer to be kind of in charge of public relations. We've had some people in the past, but they've been drawn off into other things. But yeah, just somebody to put our meetings on the calendar, send out the Discuss or Announce

00:07:43.000 --> 00:07:55.000
information when we get a meeting set up. But yeah, we could use some helpful fingers on that. So, anybody want to volunteer? Contact me.

00:07:55.000 --> 00:08:06.000
Um, let's see, what else do we have? Any changes in the calendar, or something that, uh… we should do?

00:08:06.000 --> 00:08:12.000
Do you know of anything? No, not off the top. Okay.

00:08:12.000 --> 00:08:21.000
Let's see… I think that probably covers all the categories, unless somebody knows of something.

00:08:21.000 --> 00:08:24.000
Now, considering the popularity of the subject, I mean…

00:08:24.000 --> 00:08:38.000
Probably want to jump into it. Yeah, yeah. Let's see… anything that you can think of, Stan, that I've left out?

00:08:38.000 --> 00:08:43.000
Stan, we see your lips moving, but we don't hear your voice.

00:08:43.000 --> 00:08:53.000
You're muted, Stan.

00:08:53.000 --> 00:08:58.000
Thank you.

00:08:58.000 --> 00:09:06.000
and we heard the thank you.

00:09:06.000 --> 00:09:10.000
I hate Zoom. I hate Zoom.

00:09:10.000 --> 00:09:13.000
As do we all. Anything to add, Stan? Okay, that's a good add.

00:09:13.000 --> 00:09:17.000
I hate Zoom. Oh, oh, yeah, I almost forgot. I hate Meetup worse.

00:09:17.000 --> 00:09:23.000
Okay. There we go.

00:09:23.000 --> 00:09:31.000
Okay, let's see. Let's see, our speakers tonight.

00:09:31.000 --> 00:09:43.000
Uh, we got Scott Granneman, former vice president of the group. He's waving to us there. And Jans Carton. Both of them are the principals at WebSanity.

00:09:43.000 --> 00:09:52.000
And they've been excellent speakers for us for quite some time. Scott.

00:09:52.000 --> 00:10:08.000
totally surprised us, what, about 2 years ago? 2022? Maybe. And he said, okay, guys, hold on, artificial intelligence is just going to fly. I mean, it's going to grow like crazy.

00:10:08.000 --> 00:10:15.000
Wonderful presentation, and he was right. Everything just took off a couple months after he warned us that that's what's going to happen.

00:10:15.000 --> 00:10:25.000
His presentation was in November, and in January, it was a floodgate. Everybody else in the world must have seen Scott's presentation.

00:10:25.000 --> 00:10:44.000
When Scott's not working at WebSanity, he's an adjunct professor. He's teaching at Webster University and at St. Charles Community College right now, and he's taught at Wash U and other places around town in the past.

00:10:44.000 --> 00:10:54.000
So, uh, with that, uh… Oh, I guess I did say if anybody really wants to ask a question up front on just a general topic.

00:10:54.000 --> 00:11:01.000
Uh, now's the time. I will make a really quick observation.

00:11:01.000 --> 00:11:03.000
You mean a call for help?

00:11:03.000 --> 00:11:14.000
Yeah, call for help. Let me make a really quick observation from working with some folks on a government contract this week.

00:11:14.000 --> 00:11:23.000
So, an HP ProLiant, a DL360 model. It's a 1U rack-mount server.

00:11:23.000 --> 00:11:28.000
If you go to the HP site right now, it lists for $70,000.

00:11:28.000 --> 00:11:37.000
Uh, the government, until today, was getting them for $10,800.

00:11:37.000 --> 00:11:46.000
And you're on… well, I found out about it on.

00:11:46.000 --> 00:11:53.000
I found out about it on Monday, but that $10,800 government purchase price jumped up to twenty…

00:11:53.000 --> 00:12:09.000
$24,500. So what is that? A steep increase. Yeah, steep increase. So, and the price increase was attributed to

00:12:09.000 --> 00:12:18.000
the memory and the solid state disks. The chip cost is just skyrocketing.

00:12:18.000 --> 00:12:33.000
I've also heard some stuff just from other sources. If you're thinking, well, I don't buy 1U servers of that quality, well, even the price of laptops is supposedly going to be skyrocketing.

00:12:33.000 --> 00:12:37.000
Any other general comments or questions? Can you hear me? Yes, Steve.

00:12:37.000 --> 00:12:42.000
Hey, Gary, can you hear me now?

00:12:42.000 --> 00:12:43.000
Yeah?

00:12:43.000 --> 00:12:44.000
Yeah, yes.

00:12:44.000 --> 00:12:53.000
We have two Steves logged in. Zoom can't deal with two people with the same name. You've got to do something.

00:12:53.000 --> 00:12:57.000
Well, let me log out and… yeah, two of them?

00:12:57.000 --> 00:12:58.000
I don't think that's a problem.

00:12:58.000 --> 00:13:00.000
Especially if there's a person.

00:13:00.000 --> 00:13:01.000
I'm not sure.

00:13:01.000 --> 00:13:02.000
How in the world do you change your name?

00:13:02.000 --> 00:13:04.000
Well, change your name with Steve Stefan.

00:13:04.000 --> 00:13:08.000
I'll just leave and come back.

00:13:08.000 --> 00:13:10.000
It won't do any good if there's another Steve on here with no last name.

00:13:10.000 --> 00:13:13.000
I don't think that matters, Stan, because it uses a unique ID number.

00:13:13.000 --> 00:13:16.000
It does matter. I've seen it. I've seen it mess up real bad. It does matter on Zoom.

00:13:16.000 --> 00:13:23.000
Okay. Okay.

00:13:23.000 --> 00:13:29.000
Anything else? I'm ready to go! I'm champing at the bit!

00:13:29.000 --> 00:13:32.000
Thank you. We have a lot to cover.

00:13:32.000 --> 00:13:33.000
Okay.

00:13:33.000 --> 00:13:37.000
Okay. We're turning it over to Scott. We've got a lot of material to cover. Oh, by the way, Scott… Scott, do you want people to interrupt you, or wait until you ask for questions?

00:13:37.000 --> 00:13:40.000
I think it's okay, once we get to Claude Code, for people to ask questions.

00:13:40.000 --> 00:13:41.000
Uh, but if you could hold off until we get there, that'd be good.

00:13:41.000 --> 00:13:45.000
Okay. Very good. Thank you, sir, and go.

00:13:45.000 --> 00:13:53.000
Okay, so we have a slight problem. My co-presenter tonight, Jans Carton, whom many of you have seen…

00:13:53.000 --> 00:13:57.000
Um, he's getting a colonoscopy tomorrow.

00:13:57.000 --> 00:13:59.000
So, about…

00:13:59.000 --> 00:14:02.000
An hour and 45 minutes ago…

00:14:02.000 --> 00:14:06.000
He started drinking the necessary stuff,

00:14:06.000 --> 00:14:10.000
And about 10 minutes ago, much to his surprise,

00:14:10.000 --> 00:14:19.000
He suddenly had the effect of drinking that stuff an hour and 45 minutes ago, so he will be joining us when he can.

00:14:19.000 --> 00:14:22.000
And if he is in the middle of presenting, and all of a sudden he says, I gotta go, and runs out…

00:14:22.000 --> 00:14:27.000
For one or two minutes at a time.

00:14:27.000 --> 00:14:31.000
Don't make fun of the poor guy, but uh… that's what's going on.

00:14:31.000 --> 00:14:34.000
So, I just wanted to make sure everybody understood that.

00:14:34.000 --> 00:14:35.000
Alright, let me share my screen.

00:14:35.000 --> 00:14:39.000
Yep.

00:14:39.000 --> 00:14:40.000
I know. We're a bunch of old people, we know. We've all been there.

00:14:40.000 --> 00:14:44.000
We've all been there.

00:14:44.000 --> 00:14:48.000
Alright.

00:14:48.000 --> 00:14:49.000
Let's share the screen. Okay.

00:14:49.000 --> 00:14:50.000
I hate being elderly.

00:14:50.000 --> 00:14:52.000
So, let me turn off this little window…

00:14:52.000 --> 00:14:58.000
Cool. Okay, so I'm gonna go ahead and start. Now, normally, obviously, we have a tutorial,

00:14:58.000 --> 00:15:01.000
And then we have the main presentation,

00:15:01.000 --> 00:15:04.000
I'm gonna kind of give a pseudo-tutorial,

00:15:04.000 --> 00:15:09.000
To begin with, and then I'm going to…

00:15:09.000 --> 00:15:14.000
Um, we're gonna… jump into Claude Code officially.

00:15:14.000 --> 00:15:16.000
And that will…

00:15:16.000 --> 00:15:23.000
be, like, the main one, so I'm kind of giving the tutorial now, because I'm going to give you guys a broad overview.

00:15:23.000 --> 00:15:26.000
Um, so here we go, let's jump in.

00:15:26.000 --> 00:15:28.000
So, the title of this talk…

00:15:28.000 --> 00:15:31.000
And by the way, now that Jans is in here, I can say a few things.

00:15:31.000 --> 00:15:34.000
Because he and I were arguing about a couple of terms.

00:15:34.000 --> 00:15:38.000
So, does somebody have their mic on? Because I hear feedback.

00:15:38.000 --> 00:15:42.000
Does somebody have their mic? Because I'm hearing…

00:15:42.000 --> 00:15:44.000
like that. Okay, if somebody…

00:15:44.000 --> 00:15:45.000
No, if somebody could turn the mic off, I'd appreciate… whoever has their mic on, I'd appreciate it.

00:15:45.000 --> 00:15:48.000
Now I do, now I don't.

00:15:48.000 --> 00:16:01.000
So, this is about Claude Code, and what I said to Gary was that the world kind of changed in November of 2025. So, we're talking less than 6 months ago.

00:16:01.000 --> 00:16:10.000
The world changed. And that's why I wanted to give this… Jans and I wanted to give this talk tonight, because we're in a new world, as I'm about to demonstrate to you.

00:16:10.000 --> 00:16:22.000
And I know it's a world some of you already know, um, I know it's a world some of you don't know anything about. And so, we're gonna try to cover things. I'm still amazed by what we can do now that we couldn't do.

00:16:22.000 --> 00:16:35.000
Six months ago, a year ago, uh, 3 or 4 years ago, when I first started talking about this stuff with you. Literally, I just… I'm like, it is amazing to me that we have tools now that can do these things.

00:16:35.000 --> 00:16:40.000
So, I'm hope… hopefully that'll be true for the rest of you as well. Now, I read…

00:16:40.000 --> 00:16:49.000
Hours every day about a wide variety of topics. I am on Hacker News, reading what people say there. I use Ars Technica. I like to read the comments there.

00:16:49.000 --> 00:16:55.000
And, uh, I also use Mastodon for my social media, and Bluesky for social media.

00:16:55.000 --> 00:17:00.000
And one of the topics that people discuss everywhere is AI. And…

00:17:00.000 --> 00:17:09.000
Uh, generative AI of all sorts of topics, and I'm here to tell you, and maybe some of you already know this, there is a lot of knee-jerk hatred

00:17:09.000 --> 00:17:17.000
about AI tools out there. People just… you say the word AI, and it's kind of like in the 50s or 60s saying communist.

00:17:17.000 --> 00:17:26.000
It's like, they quit thinking, and they just react viscerally. Now, some people have reasonable reasons, and I understand that, but a lot of it's just…

00:17:26.000 --> 00:17:29.000
It's very knee-jerk, I hate that.

00:17:29.000 --> 00:17:33.000
And they don't want to engage any further, and so I wanted to show you two of these.

00:17:33.000 --> 00:17:41.000
Uh, to give you an idea what, you know, people are saying, not everybody, certainly, but this is definitely out there.

00:17:41.000 --> 00:17:46.000
So, Bluesky went down the other day, and people immediately blamed AI coding.

00:17:46.000 --> 00:17:57.000
And it wasn't. That wasn't the cause, but that didn't prevent hundreds of thousands of people from saying, oh, AI coding, that's causing the problem. So this person, this is 2 days ago:

00:17:57.000 --> 00:18:15.000
With the amazing functionality Bluesky is showing today, here's a hot take: any developer or programmer using vibe coding, or any reliance on AI to code things, is clearly too stupid to know how to do the job they're paid to do, and should be fired out of a cannon.

00:18:15.000 --> 00:18:18.000
Coding takes skill, not slop.

00:18:18.000 --> 00:18:20.000
And here's a second one.

00:18:20.000 --> 00:18:23.000
AI is a slot machine, plain and simple.

00:18:23.000 --> 00:18:29.000
People are excited because it's new, but it is still not better than a professional at anything.

00:18:29.000 --> 00:18:34.000
Hell, it is not better than even an idiot at most things.

00:18:34.000 --> 00:18:41.000
And I've shown these quotes like this to you guys every time I've talked about AI, and I've always had the same response I say to you guys.

00:18:41.000 --> 00:18:47.000
And that is, I don't know what they're trying to do with AI that they have these opinions. Because…

00:18:47.000 --> 00:18:55.000
It is so incredibly useful to me, it's incredibly useful to Jans. I read online all the time about people that are using this

00:18:55.000 --> 00:19:02.000
every day to do jobs they couldn't do before, or to do it a lot faster, or more efficiently.

00:19:02.000 --> 00:19:08.000
We're using it to save money, because we don't have to hire as many outside developers now, for instance.

00:19:08.000 --> 00:19:16.000
And as a counter to the hate, I wanted to show you this one response that I saw, uh, yesterday as well.

00:19:16.000 --> 00:19:22.000
Actually, from February 12th: Until December of last year, I was using LLMs as fancy autocomplete for coding.

00:19:22.000 --> 00:19:30.000
It was nice for scaffolding out boilerplate, or giving me a gut check on some things, or banging out some boring routine stuff.

00:19:30.000 --> 00:19:36.000
In the past two months, Claude has written about 99% of my code, things are changing.

00:19:36.000 --> 00:19:41.000
Fast. And that's what I want to present to you guys tonight.

00:19:41.000 --> 00:19:51.000
Because that's been my experience. Now, 99%, that varies. I asked Jans today, um, how much do you think Claude's written of your code? Like, 90%? And he said, I have no idea.

00:19:51.000 --> 00:19:56.000
Maybe around that? Maybe not? So it's gonna vary for different people.

00:19:56.000 --> 00:20:01.000
But more and more people are using it more and more, and I think by the end of this, you're going to see why.

00:20:01.000 --> 00:20:09.000
So, why now? Why do we want to have this talk now? What changed? Well, November 2025 is the inflection point.

00:20:09.000 --> 00:20:15.000
Because what happened then was that several frontier LLM models and coding surfaces

00:20:15.000 --> 00:20:19.000
crossed a threshold at the same time.

00:20:19.000 --> 00:20:24.000
And what happened was that all of these, basically, were aimed directly

00:20:24.000 --> 00:20:31.000
at agentic coding workflows. Long ones, not short one-offs like, hey, how do you code this?

00:20:31.000 --> 00:20:38.000
Much longer, and also the ability to use tools. Now, I'm gonna break that down for you and explain exactly what that means.

00:20:38.000 --> 00:20:46.000
So, in terms of OpenAI and ChatGPT, they pushed AI coding way beyond autocomplete at that point.

00:20:46.000 --> 00:20:52.000
Uh, GPT-5.1 came out, and it was much, much better, leaps and bounds better

00:20:52.000 --> 00:20:55.000
at working through multi-step programming tasks.

00:20:55.000 --> 00:20:58.000
In fact, it can now choose when to think harder,

00:20:58.000 --> 00:21:05.000
So you don't need to manually adjust it anymore to, like, you know, normal, uh, thinking; it automatically will do that.

00:21:05.000 --> 00:21:11.000
It knows and chooses when to answer quickly and when not, and when to use tools like the shell

00:21:11.000 --> 00:21:16.000
to help it solve problems, which made it a lot more useful for editing files,

00:21:16.000 --> 00:21:22.000
running commands, testing code, and fixing problems across an entire project.

00:21:22.000 --> 00:21:28.000
Anthropic, maker of Claude, man, they have been at the forefront of this. That's why we're focusing on them tonight.

00:21:28.000 --> 00:21:34.000
And they have pushed longer, more independent coding work, and they're clearly in the lead now.

00:21:34.000 --> 00:21:42.000
So, Claude Opus 4.5, which was released about this time, was built to stay on task longer without losing the thread of what was going on.

00:21:42.000 --> 00:21:50.000
It was specifically aimed at bigger coding tasks that take a lot of steps, not just little one-off, you know, functions or code snippets.

00:21:50.000 --> 00:21:57.000
And it helped make AI feel like a collaborator that could actually keep working through a problem with you.

00:21:57.000 --> 00:22:04.000
Now, also, Google. I'm not gonna leave Google out, you guys know I'm not the biggest Google fan, but when they do something interesting or good, I'll give them credit.

00:22:04.000 --> 00:22:12.000
They pushed their AI agents that work across developer tools, especially, again, the terminal. Notice this focus on the terminal.

00:22:12.000 --> 00:22:16.000
Google introduced a piece of software called Antigravity,

00:22:16.000 --> 00:22:24.000
Which is an AI-assisted coding tool that can work across editors, terminals, and browsers to do the job.

00:22:24.000 --> 00:22:34.000
And again, it was transitioning to tool-using software assistance. So now, they can actually do things for you, and I'm going to show you that.

00:22:34.000 --> 00:22:40.000
Now, I also said the coating surfaces change, so the coating surface is where you use these models.

00:22:40.000 --> 00:22:54.000
where you use the AI tools. So, for instance, in VS Code, you might have a side panel open that VS Code… that codecs, which is made by OpenAI, or Claude is running, or you may just be using the terminal to interact with it.

00:22:54.000 --> 00:22:59.000
or an actual app, like the Claude app or Codex, which is an app.

00:22:59.000 --> 00:23:03.000
Uh, all of those things, those would be the coding surface, but those improved.

00:23:03.000 --> 00:23:12.000
Because all of a sudden, they had more awareness of files, and how they worked together. They are really good at diff…

00:23:12.000 --> 00:23:18.000
seeing diffs, and then patching much better, and in fact, using Git, they're really good now at Git.

00:23:18.000 --> 00:23:21.000
Uh, terminal access. They're great at running tests now.

00:23:21.000 --> 00:23:27.000
uh, repo-wide context where they can understand all the files in a repo in a project.

00:23:27.000 --> 00:23:40.000
Uh, tighter integration with the editor, and less friction. So now, you can make a suggestion and say, what do you think? It'll come back with, uh, solutions, you can pick one, and it'll actually go ahead and do it for you.
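The diff-and-patch workflow described above can be sketched in a few lines. This is a generic illustration using Python's standard difflib module, not any particular tool's (Claude Code's or Codex's) actual implementation; the file name and contents are made up:

```python
# Generic sketch: a coding surface shows a unified diff between the current
# file and the AI-proposed version, for you to review before it patches.
import difflib

# Old (buggy) and AI-proposed versions of a file, as lists of lines.
old = ["def add(a, b):\n", "    return a - b\n"]
new = ["def add(a, b):\n", "    return a + b\n"]

# Build the unified diff the way a review panel would display it.
diff = "".join(difflib.unified_diff(old, new,
                                    fromfile="utils.py",
                                    tofile="utils.py (proposed)"))
print(diff)
```

Running this prints a standard `---`/`+++` diff with the removed line marked `-` and the proposed line marked `+`, which is the same shape these tools present before applying a change.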

00:23:40.000 --> 00:23:43.000
So, let's look at some of the tooling we're talking about.

00:23:43.000 --> 00:23:48.000
Now, let's dig into what modern AI tools can do for developers.

00:23:48.000 --> 00:23:54.000
So, AI tools, and this is a biggie, can now understand entire codebases.

00:23:54.000 --> 00:24:07.000
Before, you might have had to look at a file, um, or maybe two files, but now it can read across all the files in a project, in a repo, and look at multiple files at once, and understand how they relate together.

00:24:07.000 --> 00:24:13.000
and edit those files. This was a big thing, we now have what's called a larger context window.

00:24:13.000 --> 00:24:22.000
So the context window is the amount of information an AI can use to understand what you want and what it should pay attention to.

00:24:22.000 --> 00:24:29.000
And I like to use the analogy, you know, several years ago, it was like we had one megabyte of RAM, and most of us remember what that was like.

00:24:29.000 --> 00:24:39.000
And now, the contexts are much bigger. You can have up to a million tokens in some contexts, and that's like moving from one meg of RAM to 64 GB.

00:24:39.000 --> 00:24:46.000
In the same way that we can address a lot more on the computer once we have more RAM, the larger context allows it to

00:24:46.000 --> 00:24:50.000
understand a lot more and work with a lot more.
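As a rough illustration of that analogy, here is a back-of-the-envelope sketch of whether a set of source files fits in a 1,000,000-token context window. It assumes the common "about 4 characters per token" heuristic; real tokenizers vary by model, and the file names are invented:

```python
# Back-of-the-envelope sketch: does a whole project fit in a large context
# window? Assumes ~4 characters per token, which is only a rough heuristic.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 chars/token)."""
    return len(text) // 4

def fits_in_context(files: dict[str, str], budget: int = 1_000_000) -> bool:
    """True if the combined files fit within the token budget."""
    total = sum(estimate_tokens(src) for src in files.values())
    return total <= budget

# A toy "repo": with a large context, the model can read all of it at once.
project = {
    "index.html": "<p>hello</p>\n" * 200,
    "style.css": "body { margin: 0; }\n" * 100,
}
print(fits_in_context(project))  # a small project fits easily -> True
```

The point of the sketch is the budget arithmetic: a million-token window holds on the order of a few megabytes of source, which is why whole repos now fit where single files used to.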

00:24:50.000 --> 00:24:59.000
AI tools can now look up things locally or on the web, and run terminal commands to perform these tasks. It's pretty amazing when you see it.

00:24:59.000 --> 00:25:06.000
They have better, more useful built-in behavior now, so… and I'm gonna demonstrate all this to you tonight.

00:25:06.000 --> 00:25:15.000
They will help you with planning your project, they will help you with brainstorming, and creating documentation. They will then iterate as needed throughout the project.

00:25:15.000 --> 00:25:19.000
And they will do automatic testing for you, which is really wonderful.

00:25:19.000 --> 00:25:23.000
And in fact, this is… I use the word agentic coding.

00:25:23.000 --> 00:25:33.000
One of the big, big things that's now used everywhere is agents. Agents are one of the major parts of this new era

00:25:33.000 --> 00:25:39.000
of coding. And so, an agent is an AI program that can take a goal you set,

00:25:39.000 --> 00:25:41.000
Decide what steps to take,

00:25:41.000 --> 00:25:53.000
And then use tools to try to complete the task. It's pretty amazing. And so, let's look at it like this. Let's say you have a goal. You want something done. You want to create a website, or you want to create an app.

00:25:53.000 --> 00:26:02.000
Right? Or whatever. You have a goal. The agentic coding tool might be Claude Code, or OpenAI's Codex, or

00:26:02.000 --> 00:26:05.000
Google's, uh, tool, or something else similar.

00:26:05.000 --> 00:26:11.000
The agent is the AI working within that tool to get the job done.

00:26:11.000 --> 00:26:20.000
And it figures out what it needs to do, in what order, and how to do it. Well, it may spawn sub-agents,

00:26:20.000 --> 00:26:26.000
And each of those sub-agents has a specific role in completing the goal, and they report back to the agent,

00:26:26.000 --> 00:26:28.000
called the Orchestrator,

00:26:28.000 --> 00:26:38.000
And it keeps track of what's going on, and keeps track of the progress and what's happening to get to the point where you've completed the project. You complete the goal.

00:26:38.000 --> 00:26:40.000
So, um,

00:26:40.000 --> 00:26:46.000
The thing to remember, though, is agents don't always spawn sub-agents, so if you're building a web page,

00:26:46.000 --> 00:26:50.000
or working on a function, it may very well not. It doesn't need sub-agents; they're not needed.

00:26:50.000 --> 00:26:56.000
But if you're redesigning an entire website, or let's say you're debugging a large codebase with a lot of files,

00:26:56.000 --> 00:27:00.000
Sub-agents may very well enter into the picture.

00:27:00.000 --> 00:27:07.000
And in fact, one of the things Jans said, and I've heard this… I told him, when he told me this, I said, oh, that's a common analogy. He said, oh, I didn't know.

00:27:07.000 --> 00:27:09.000
He said that

00:27:09.000 --> 00:27:12.000
In the past, he felt like a musician.

00:27:12.000 --> 00:27:23.000
Focusing on playing his instrument, okay? So, I need to edit these HTML files, I need to edit these CSS files, I need to update this JavaScript file.

00:27:23.000 --> 00:27:30.000
And so, he would put that file into OpenAI, or Claude, and it would work on that.

00:27:30.000 --> 00:27:35.000
Right? But he said, now, he's more like the conductor of an orchestra.

00:27:35.000 --> 00:27:38.000
He's pointing out, this is where I want to go,

00:27:38.000 --> 00:27:45.000
And the various agents, then, are doing the task, and he's overseeing the agents, and perhaps their sub-agents.

00:27:45.000 --> 00:27:51.000
as they perform the task. And again, that's a common analogy many people have used.

00:27:51.000 --> 00:27:58.000
Where the developer, or our admin, or whatever, has gone from being a musician to the conductor.

00:27:58.000 --> 00:28:03.000
And so what happened is, now we can leverage all this tooling…

00:28:03.000 --> 00:28:09.000
To do a wide variety of tasks. You can actually tell the AI, and again, I'm going to demonstrate this to you all tonight,

00:28:09.000 --> 00:28:11.000
You tell the AI model, I want to do something,

00:28:11.000 --> 00:28:13.000
it'll go write the code to do it.

00:28:13.000 --> 00:28:19.000
And what it's doing is, it's now filling gaps that previously required specialists.

00:28:19.000 --> 00:28:30.000
So, in our case, we were hiring a company… the main CMS we use is called Concrete CMS, thanks to Craig Buchek for suggesting it about 15 years ago. We're still using it.

00:28:30.000 --> 00:28:36.000
And, um, we used to hire a company out of Minneapolis to do some work for us,

00:28:36.000 --> 00:28:48.000
writing add-ons and extensions for the CMS. We hardly use them anymore, because now we use Claude Code to write it. And in fact, Jans was telling me that just the other day,

00:28:48.000 --> 00:28:57.000
He had needed something written, and the woman we normally hire was having a really hard time doing it. She couldn't figure out how to do it.

00:28:57.000 --> 00:29:04.000
Part of that's because Concrete's documentation is spread all over the place, and a human has to kind of go into nooks and crannies and

00:29:04.000 --> 00:29:08.000
discussion forums to look for answers and things like that.

00:29:08.000 --> 00:29:20.000
Well, she said, I can't figure this out, guys. So, Jan's fired up Claude Code, and in about 20 minutes, he said he had it working, it had solved it, it had ingested all the documentation everywhere,

00:29:20.000 --> 00:29:24.000
It knew what to do, and it was working.

00:29:24.000 --> 00:29:25.000
I'm not a liar. Oh, I'm not! Oh, you're back!

00:29:25.000 --> 00:29:27.000
You're a liar.

00:29:27.000 --> 00:29:29.000
If I'm lying, tell me the truth!

00:29:29.000 --> 00:29:30.000
Tell the truth.

00:29:30.000 --> 00:29:32.000
I did not say anything about how long it took me.

00:29:32.000 --> 00:29:33.000
Oh, I thought it did, I'm sorry.

00:29:33.000 --> 00:29:35.000
You added that yourself.

00:29:35.000 --> 00:29:37.000
Okay, it took him 1 minute.

00:29:37.000 --> 00:29:38.000
One minute.

00:29:38.000 --> 00:29:40.000
Right. No, it took a couple hours.

00:29:40.000 --> 00:29:41.000
Okay, my apologies.

00:29:41.000 --> 00:29:45.000
Because, you know, involved in all the… you know…

00:29:45.000 --> 00:29:53.000
you'll find that a large amount of your time, just like working with human developers and whatnot, you do a lot of communication and planning and analysis.

00:29:53.000 --> 00:29:55.000
And all of that kind of stuff.

00:29:55.000 --> 00:30:00.000
And so that stuff is still involved. The actual code writing,

00:30:00.000 --> 00:30:02.000
It goes pretty quick.

00:30:02.000 --> 00:30:09.000
But the whole task, and the whole task is shortened significantly, obviously, but it's not like a magic wand where you, like, you know,

00:30:09.000 --> 00:30:13.000
Read my mind and make it do what I want. You know, you still have to… you still have to.

00:30:13.000 --> 00:30:21.000
Know what you want, know how the system works, etc. Like you said, with a conductor, a conductor is not just somebody that runs out there and waves their arms around.

00:30:21.000 --> 00:30:23.000
They don't have to know what the they're doing.

00:30:23.000 --> 00:30:25.000
Right? Thank you.

00:30:25.000 --> 00:30:29.000
And so, the capper for all this stuff we're talking about is,

00:30:29.000 --> 00:30:39.000
You just talk to it. You use natural language. You don't have any specific commands to use, we don't… you don't need a specific vocabulary or knowledge of certain syntax.

00:30:39.000 --> 00:30:45.000
at all, you simply tell it what you want done, and it will figure out how to do it, and then during the iteration process, you can simply tell it.

00:30:45.000 --> 00:30:48.000
I'm going to interrupt you again, Scott.

00:30:48.000 --> 00:30:49.000
Go ahead.

00:30:49.000 --> 00:30:56.000
Because you said words like just and whatnot. It's still… and you do have to know what to tell it, and you do need certain syntax.

00:30:56.000 --> 00:30:57.000
Of course.

00:30:57.000 --> 00:31:01.000
You just need to know what you're talking about. You just don't have to… you're just not confined to just that syntax.

00:31:01.000 --> 00:31:07.000
You can use any number of descriptive ways to tell it what you want, but you still have to know what's capable,

00:31:07.000 --> 00:31:10.000
and how to build it, and all this other kind of stuff.

00:31:10.000 --> 00:31:13.000
Again, if you just.

00:31:13.000 --> 00:31:16.000
Give it vague garbage, it doesn't do it, and I've seen…

00:31:16.000 --> 00:31:22.000
I've seen you, in particular, and other people have problems getting it to do things and do it as well.

00:31:22.000 --> 00:31:25.000
Because the way the way you construct your, your.

00:31:25.000 --> 00:31:28.000
requests, and you're like, why is it doing that?

00:31:28.000 --> 00:31:31.000
You'll get… you'll get the hang of it the more time you spend on it.

00:31:31.000 --> 00:31:32.000
Absolutely.

00:31:32.000 --> 00:31:38.000
But that's why they call him an agentic engineers and whatnot, that's, that's a job description these days.

00:31:38.000 --> 00:31:41.000
is having some experience in working with the agents.

00:31:41.000 --> 00:31:43.000
Okay, so…

00:31:43.000 --> 00:31:47.000
Let's look at the big picture of AI before we jump into Claude specifically.

00:31:47.000 --> 00:31:51.000
So I first want to talk about uses, because now the uses for this stuff's exploded.

00:31:51.000 --> 00:32:01.000
So, I'm categorizing various AI tools in these groups here, and I could have done a lot more, but I wanted to focus on these. I think this is ones people will use.

00:32:01.000 --> 00:32:07.000
Commonly. So, of course, let's start with just general knowledge and chatting to find out information.

00:32:07.000 --> 00:32:13.000
So, the major players here are easily ChatGPT, Claude.ai, and Google Gemini.

00:32:13.000 --> 00:32:20.000
So, those are the ones people, most normal people think of when they think of AI tools. They have ChatGPT installed on their phone.

00:32:20.000 --> 00:32:24.000
or Claude installed, or maybe Google Gemini.

00:32:24.000 --> 00:32:27.000
Um, and they ask questions, they use it for research.

00:32:27.000 --> 00:32:33.000
Uh, things like that. Now, the interesting thing about Gemini is, you know, as you probably all know,

00:32:33.000 --> 00:32:35.000
Apple has had a heck of a time.

00:32:35.000 --> 00:32:38.000
Uh, with, uh, AI tools.

00:32:38.000 --> 00:32:44.000
Uh, with their own, their own suck. They've had Siri for 15 years now, and Siri pretty much sucks.

00:32:44.000 --> 00:32:47.000
unbelievably poorly compared to these other tools.

00:32:47.000 --> 00:32:53.000
And Apple's been trying to develop its own, but hasn't been doing a good job, and so they made a deal with Google,

00:32:53.000 --> 00:32:59.000
And starting this with the next version of iOS, macOS, iPadOS, etc.,

00:32:59.000 --> 00:33:05.000
coming up this fall, uh, they're going to be integrating Google Gemini all throughout,

00:33:05.000 --> 00:33:12.000
However, it's all gonna be running on Apple's servers, so Google doesn't have access to it, which is why I'm okay with it.

00:33:12.000 --> 00:33:16.000
Um, Google's not gonna get access to everything, anything at all.

00:33:16.000 --> 00:33:19.000
Apple's gonna keep them firewalled away from it.

00:33:19.000 --> 00:33:27.000
But we're soon, hopefully, if you use an Apple device, you'll have much better Siri, much better control of your system, much more automations.

00:33:27.000 --> 00:33:29.000
But those are the big three.

00:33:29.000 --> 00:33:35.000
Now, we have what we call Tier 1.5, like, these are significant, they're not as big as the main ones.

00:33:35.000 --> 00:33:40.000
Meta AI has its own, using a model called Llama,

00:33:40.000 --> 00:33:45.000
And so, I avoid using Meta at all costs. I don't like the company at all.

00:33:45.000 --> 00:33:52.000
Uh, but lots of people do, and so Meta AI is built into WhatsApp, Instagram, Facebook, their big three.

00:33:52.000 --> 00:33:57.000
But they also have those Ray-Ban glasses, they have Meta AI access in them as well.

00:33:57.000 --> 00:34:00.000
And then we have Microsoft Copilot.

00:34:00.000 --> 00:34:09.000
Now, this is true, I almost… I forgot to put the graphic on here, about 2 days ago, I saw an article online, and this guy actually went through and counted

00:34:09.000 --> 00:34:15.000
And said, Microsoft uses the term co-pilot with 75

00:34:15.000 --> 00:34:18.000
different services and programs.

00:34:18.000 --> 00:34:26.000
So, co-pilots just plastered over everything. So when I say co-pilot here, I'm talking about the free consumer cat bot.

00:34:26.000 --> 00:34:31.000
Right? And that actually runs on ChatGPT, it's free, it has web search built in.

00:34:31.000 --> 00:34:36.000
Um, but it's okay, it's okay. Barely okay in my book.

00:34:36.000 --> 00:34:38.000
And then we have the second tier, so DeepSeek.

00:34:38.000 --> 00:34:45.000
is still around. It's a Chinese model that, when it was released in January of 2025, freaked everybody out.

00:34:45.000 --> 00:34:50.000
Because it was matching ChatGPT4 performance at a fraction of the cost in compute.

00:34:50.000 --> 00:34:56.000
Quinn by Alibaba is the world's most downloaded open-source model that runs locally.

00:34:56.000 --> 00:35:02.000
And finally, I have to mention, uh, XAI Grok. This is Elon Musk's entry.

00:35:02.000 --> 00:35:07.000
That's built into Shitter. Um, it has a user base due to platform integration, because…

00:35:07.000 --> 00:35:15.000
people use shitter, and there it is, right there. And it's also used by people that don't care that it's biased, and full of child sexual abuse material.

00:35:15.000 --> 00:35:18.000
So, there you go. That's the kind of people using XAI Grok.

00:35:18.000 --> 00:35:24.000
Um, what about image generation? Well, one of the biggies there is mid-journey, still popular and used.

00:35:24.000 --> 00:35:29.000
Dolly, which you can access via ChatGPT, I've used that many times.

00:35:29.000 --> 00:35:34.000
Adobe Firefly, and Nano Banana came out several months ago.

00:35:34.000 --> 00:35:42.000
Uh, it's… you… it's, uh, you access that via Google Gemini, and it's actually really cool, very powerful, can produce some neat stuff.

00:35:42.000 --> 00:35:48.000
Video generation, the biggie here is that Sora was discontinued just earlier this month.

00:35:48.000 --> 00:35:50.000
Uh, because it was costing them,

00:35:50.000 --> 00:35:55.000
a million-plus dollars a day, but oh, less than 500,000 people were using it.

00:35:55.000 --> 00:36:03.000
did not have adoption, but tools like Runway and Google Vio are used to generate video, and that's getting to be more and more.

00:36:03.000 --> 00:36:08.000
Uh, audio and musical generation, we have Suno for music, and Eleven Labs will turn voice.

00:36:08.000 --> 00:36:11.000
to, uh, turn text-to-speech.

00:36:11.000 --> 00:36:16.000
or voice-to-speech, and do a dictation, basically.

00:36:16.000 --> 00:36:18.000
Now, what about work automation?

00:36:18.000 --> 00:36:21.000
Well, Claude Co-work is an aspect of Claude.

00:36:21.000 --> 00:36:26.000
that you can use, and it's meant for more quote-unquote normal people that aren't

00:36:26.000 --> 00:36:32.000
developers or coders, and it basically does some automation on your computer to get certain tasks done,

00:36:32.000 --> 00:36:36.000
It integrates with Excel, for instance, and PowerPoint.

00:36:36.000 --> 00:36:43.000
Microsoft… and there's another co-pilot… Microsoft 365 Copilot is actually supposed to be pretty good,

00:36:43.000 --> 00:36:52.000
But it's built into my… what we used to call Microsoft Office. So, it only is good if you're completely encased in the Microsoft ecosystem.

00:36:52.000 --> 00:36:55.000
In that case, this supposedly does a pretty good job.

00:36:55.000 --> 00:37:00.000
And then we have other tools for productivity, like Zapier and Notion.

00:37:00.000 --> 00:37:07.000
Then we get research and deep analysis. Well, ChatGPT is excellent at doing deep research. I've used it, I know Janz has used it.

00:37:07.000 --> 00:37:11.000
Claude can do extensive research and deep analysis as well.

00:37:11.000 --> 00:37:17.000
And there's a tool called Perplexity out there that is search-based, but it will do research, and it's rather controversial.

00:37:17.000 --> 00:37:21.000
Because it does things like ignore, uh,

00:37:21.000 --> 00:37:26.000
on websites, when people say, do not index my site, please, and it will ignore that and index it anyway.

00:37:26.000 --> 00:37:30.000
So, they're a little controversial in that regard.

00:37:30.000 --> 00:37:32.000
Then, I want to focus on this for just a moment.

00:37:32.000 --> 00:37:41.000
Because I think this is gonna be the future for a lot of AI usage, and that's going to be vertical usage or industry-specific.

00:37:41.000 --> 00:37:45.000
So, what's gonna happen is various AI tools are gonna focus

00:37:45.000 --> 00:37:51.000
targeted on a specific industry, trained on materials from that industry.

00:37:51.000 --> 00:37:58.000
And that's going to be more widely used by people in those industries. So, for instance…

00:37:58.000 --> 00:38:06.000
Open evidence is an AI specifically for medical doctors. In fact, if you look at the diagram,

00:38:06.000 --> 00:38:12.000
Down, there are 3 lines, it says NPI required. That is a physician ID number.

00:38:12.000 --> 00:38:17.000
So, this… you have to have a physician ID number to use this tool.

00:38:17.000 --> 00:38:19.000
It's not meant for the general public.

00:38:19.000 --> 00:38:25.000
And what makes it… and I know doctors that use this. They actually open it up immediately and look on it.

00:38:25.000 --> 00:38:28.000
Why? Because it's been trained on peer-reviewed research,

00:38:28.000 --> 00:38:44.000
from sources like the New England Journal of Medicine, the Journal of the American Medical Association, uh, NCCN has to do with cancer, and so on, and they're partnered with all these other very legitimate medical organizations, like the American Dental Association.

00:38:44.000 --> 00:38:47.000
And so on, they rigorously cite

00:38:47.000 --> 00:38:55.000
everything they produce. And so, doctors trust it, and more and more and more are using it, because it's not some general tool.

00:38:55.000 --> 00:39:04.000
that goes to Google and searches there. It is very tightly focused and trained on actual good, known medical sources.

00:39:04.000 --> 00:39:07.000
Uh, Bloomberg, of the Bloomberg terminals.

00:39:07.000 --> 00:39:14.000
Uh, they have Bloomberg Intelligence that you can access. That's been trained on financial data, news, and market information.

00:39:14.000 --> 00:39:20.000
Uh, Autodesk is, um, used by people in the architecture and construction business,

00:39:20.000 --> 00:39:25.000
It's been trained on building code, structural data, a computer-aided design workflows,

00:39:25.000 --> 00:39:30.000
And so, that's been useful for people involved in that industry.

00:39:30.000 --> 00:39:33.000
And finally, I found this one really interesting, because I'm very much…

00:39:33.000 --> 00:39:39.000
a guy that's interested in movies, and that is an organization, a company called Interpositive,

00:39:39.000 --> 00:39:43.000
And it's meant for filmmakers. Now, a lot of Hollywood's freaking out.

00:39:43.000 --> 00:39:52.000
Because they're worried it's going to take a lot of jobs, and what's going to happen to the whole idea of, you know, human beings creating these films. Ben Affleck,

00:39:52.000 --> 00:39:57.000
Founded this company in 2022, it was just acquired by Netflix a few weeks ago.

00:39:57.000 --> 00:39:59.000
It's interesting because

00:39:59.000 --> 00:40:04.000
You… if you're a filmmaker, and you use it, it actually

00:40:04.000 --> 00:40:08.000
trains itself on your film's existing footage.

00:40:08.000 --> 00:40:10.000
That's what it learns from.

00:40:10.000 --> 00:40:15.000
And then it lets filmmakers use this tool, Inner Positive, in post-production.

00:40:15.000 --> 00:40:24.000
So they can do things like color grading, relighting shots, adding visual effects, and so on, and because it's already been trained on the footage you've filmed,

00:40:24.000 --> 00:40:28.000
The stuff it creates is gonna fit in with that much, much, much better.

00:40:28.000 --> 00:40:30.000
I find that fascinating.

00:40:30.000 --> 00:40:32.000
But of course, what we're talking about tonight…

00:40:32.000 --> 00:40:35.000
Our coding and technical tasks.

00:40:35.000 --> 00:40:43.000
So, of these, the big three are Claude Code, which is terminal native, as you're gonna see tonight, use the terminal.

00:40:43.000 --> 00:40:53.000
A dentic coating all the way. This is the most… I would say, it's not the most widely used out there, because, as you're going to see in the next slide,

00:40:53.000 --> 00:40:58.000
Uh, co-pilot's the most widely used, but the reason is because

00:40:58.000 --> 00:41:04.000
All these IT people, chuds, that only know Microsoft, where they've already… they're already paying for Microsoft,

00:41:04.000 --> 00:41:08.000
Well, Microsoft says, oh, you can get Copilot, too, and so…

00:41:08.000 --> 00:41:24.000
All these CTOs and IT people are shoving Copilot down the throats of developers, so it's one of the most widely used, but in a survey, only 9% of developers liked it. So they're being forced to use it, but they don't want to use it.

00:41:24.000 --> 00:41:32.000
So, Claude Code, though, has a current 41% market share among professional developers, and it is the most liked.

00:41:32.000 --> 00:41:35.000
That's the ones that they… that's the one that they want to use.

00:41:35.000 --> 00:41:41.000
Now, OpenAI has introduced Codex. It can run in a GUI, or via the command line,

00:41:41.000 --> 00:41:48.000
It's been out, not for several months, about less than 6 months, and it's growing.

00:41:48.000 --> 00:41:53.000
But it's Cloud Code's still ahead, and then cursor is actually a very interesting

00:41:53.000 --> 00:42:01.000
IDE based on VS Code, but completely interspersed with AI everywhere through it.

00:42:01.000 --> 00:42:09.000
And it has around an 18% market share. It's widely used. Interestingly, you can use the built-in AI, or you can choose other models as well.

00:42:09.000 --> 00:42:11.000
Well, that brings us to Copilot!

00:42:11.000 --> 00:42:14.000
Well, I think Copilot for coding,

00:42:14.000 --> 00:42:24.000
It's pretty bad. Uh, I've never been impressed by it. Uh, it's pretty much autocomplete only. It's not really agentic, and this is a damning

00:42:24.000 --> 00:42:34.000
Damning fact, and I read this about a month ago, it turns out that Microsoft's own engineers are testing Claude code internally. They don't want to use Copilot.

00:42:34.000 --> 00:42:44.000
So, think about it. Microsoft's own developers don't want to use the tool their company is trying to sell and forcing other companies to try to use.

00:42:44.000 --> 00:42:50.000
They would rather use Claude code, so if that doesn't tell you something, I don't know what will.

00:42:50.000 --> 00:42:55.000
Okay, then I thought this would be particular of interest to you guys, cloud versus local.

00:42:55.000 --> 00:43:00.000
So, most of the models we've talked about previous to this are cloud models.

00:43:00.000 --> 00:43:09.000
But, some models are meant to be run locally, and I would actually encourage a lot of you guys to check these out. So, why run local models? Well, number one is private.

00:43:09.000 --> 00:43:16.000
OpenAI, Claude, Google isn't getting access or seeing what you're doing.

00:43:16.000 --> 00:43:21.000
Number two, cost. There's no API fees. It's running on your device. You don't have to pay anything.

00:43:21.000 --> 00:43:24.000
Um, you could… you don't need an internet connection.

00:43:24.000 --> 00:43:27.000
You can run it offline if you want.

00:43:27.000 --> 00:43:32.000
Uh, latency, you don't have to wait for it to make a round trip to the server and then come back.

00:43:32.000 --> 00:43:37.000
There's no rate limits. Um, Claude Code, for instance, there are limits on that.

00:43:37.000 --> 00:43:46.000
Uh, you can be throttled at certain times. Uh, well, there's no throttling, and you can run as fast as you want on your hardware whenever you want.

00:43:46.000 --> 00:43:51.000
Uh, regulation compliance, so if you have to deal with HIPAA laws or GDPR laws,

00:43:51.000 --> 00:43:55.000
then you don't have to worry about that, because everything's on your machine.

00:43:55.000 --> 00:44:01.000
Uh, you can fine-tune it and customize it, and finally, you can experiment. You can mess around with all sorts of stuff.

00:44:01.000 --> 00:44:07.000
So, uh, it really allows you to really get your hands dirty and dig in there and mess around with things.

00:44:07.000 --> 00:44:13.000
So, here they are. Llama, which is made by Meta, is widely used.

00:44:13.000 --> 00:44:19.000
Uh, locally, Gemma 4 just came out from Google, like, a week or so ago.

00:44:19.000 --> 00:44:22.000
I downloaded that and started playing with it.

00:44:22.000 --> 00:44:24.000
It's getting very high marks.

00:44:24.000 --> 00:44:31.000
Quinn, which is made by Alibaba, as I mentioned earlier, is the most downloaded open source model. I would definitely check that out.

00:44:31.000 --> 00:44:34.000
Mistral is a French company that's popular in Europe.

00:44:34.000 --> 00:44:39.000
DeepSeek is the Chinese company that caused the market panic. You can access that.

00:44:39.000 --> 00:44:42.000
Now, once you get these, how do you use them?

00:44:42.000 --> 00:44:47.000
We don't have time to go into great detail, but I think this is enough to get a lot of you guys started.

00:44:47.000 --> 00:45:00.000
Olama is a CLI tool that most people use to actually run those models locally. It's what I use. It's not that hard, it's pretty easy, you can find tutorials about it online, but it's really not that complicated.

00:45:00.000 --> 00:45:06.000
So, if you're interested in trying out these local tools, I would definitely get Olama. Now,

00:45:06.000 --> 00:45:16.000
If you don't want to use the CLI directly, you'd feel more comfortable with a GUI. There's a company that makes a tool called LM Studio that provides a GUI, but recently…

00:45:16.000 --> 00:45:19.000
They pulled out the core of that GUI app,

00:45:19.000 --> 00:45:27.000
And called it LLMster, and that's CLI only, meant for servers. So, Ollama, you can not only run locally, you could run it on a server.

00:45:27.000 --> 00:45:33.000
LLMster, you could run it on a server. And what's really cool, and I know everybody here will like this, is…

00:45:33.000 --> 00:45:41.000
All of these tools run on Windows, Mac, and Linux. So, totally available to everybody to mess around with this stuff.

00:45:41.000 --> 00:45:49.000
Now, but there's one thing you need to know. So, all the tools I mentioned, can you download and run them locally? Absolutely.

00:45:49.000 --> 00:45:53.000
Can you use it commercially? Most of them are built with an Apache 2.0 license, yeah!

00:45:53.000 --> 00:45:58.000
Can you see exactly how they were built? Are they really open source? Nope.

00:45:58.000 --> 00:46:00.000
Nope. Mistral is the closest one.

00:46:00.000 --> 00:46:03.000
But Google Gemma? That's not open source.

00:46:03.000 --> 00:46:12.000
They're not gonna let you see exactly how it was built. So, they're available, but don't expect them to be completely transparent, because they're not.

00:46:12.000 --> 00:46:14.000
Okay, model tiers.

00:46:14.000 --> 00:46:23.000
Let's look at this real quick, because when people say claud, when people say OpenAI, it's more complicated than that.

00:46:23.000 --> 00:46:30.000
Because every AI has model tiers. So, OpenAI is famous for the… they have horrible names.

00:46:30.000 --> 00:46:34.000
For the various versions of their AI tools.

00:46:34.000 --> 00:46:39.000
horrible. So, Mini and Nano are the ones that are always super fast, super cheap, really small.

00:46:39.000 --> 00:46:46.000
Um, they're suitable for very quick stuff and easy stuff, but they're not gonna be good for a vented coating in any way.

00:46:46.000 --> 00:46:50.000
Uh, GPT-53 is the default.

00:46:50.000 --> 00:46:53.000
If you are using CareGPT for free,

00:46:53.000 --> 00:47:00.000
You are using GPT-5.3, you get 10 messages every 5 hours before it drops you down to the mini model.

00:47:00.000 --> 00:47:09.000
GPT-54 thinking does a lot deeper reasoning and deeper thinking. Uh, it's available at paid tiers only, which I'm going to get to in a moment.

00:47:09.000 --> 00:47:15.000
And then ChatGPT54 Pro is the most powerful, it's, like, the most expensive.

00:47:15.000 --> 00:47:20.000
Anthropic is real easy. Anthropic is very stable with its names, it's super easy.

00:47:20.000 --> 00:47:30.000
Haiku, Sonnet, Opus. Haiku is the fast and cheap model. Sonnet is balanced, it's very good, I use it a lot. Opus is the most powerful.

00:47:30.000 --> 00:47:43.000
Um, I was talking with Jans the other day about this, and I said, so, do you use Opus just a few… a little bit at the time? And he said, no, right? Correct me if I'm wrong. Uh, I believe you said you use it almost all the time.

00:47:43.000 --> 00:47:48.000
Because you wanted the greater thinking ability and reasoning ability,

00:47:48.000 --> 00:47:49.000
that it has, is that correct?

00:47:49.000 --> 00:47:52.000
I use Opus…

00:47:52.000 --> 00:47:54.000
all the time for doing work stuff.

00:47:54.000 --> 00:47:59.000
But, since I'm running just a little over 25 to.

00:47:59.000 --> 00:48:01.000
28% of the…

00:48:01.000 --> 00:48:04.000
Of the usage.

00:48:04.000 --> 00:48:07.000
Well, when you look at the pricing and stuff like that, um…

00:48:07.000 --> 00:48:08.000
I'm getting to that.

00:48:08.000 --> 00:48:12.000
I'm up, I'm between the $100 and $200 tier mark.

00:48:12.000 --> 00:48:18.000
So I have lots of extra room, so I just use it all the time.

00:48:18.000 --> 00:48:26.000
Yep. Okay, so Google has Flash, which is the fast and cheap small model, Pro, and Ultra.

00:48:26.000 --> 00:48:29.000
which is the most powerful. Now, this is kind of interesting.

00:48:29.000 --> 00:48:31.000
Unfortunately, again, I don't use Google.

00:48:31.000 --> 00:48:34.000
for search. I quit using them a long time ago.

00:48:34.000 --> 00:48:39.000
When people search Google, and Google's shoving AI answers down everybody's throat now,

00:48:39.000 --> 00:48:45.000
You have no idea which model of these Google used. You have no idea if it was Flash or Pro, maybe it was Ultra?

00:48:45.000 --> 00:48:47.000
Maybe? You have no idea.

00:48:47.000 --> 00:48:49.000
I want to contrast that with Kagi.

00:48:49.000 --> 00:48:57.000
I've actually talked to you guys before about Coggy Search. I… and Janz uses it too, and I think it's truly the best search engine you can use today.

00:48:57.000 --> 00:49:02.000
Uh, it's far better than Google in many, many ways, and one of the many ways has to do with AI integration.

00:49:02.000 --> 00:49:05.000
Because a lot of people don't want AI answers.

00:49:05.000 --> 00:49:08.000
They just want the traditional 10 blue links.

00:49:08.000 --> 00:49:15.000
Well, Kagi only provides AI answers if you explicitly ask, or if your query ends with a question mark.

00:49:15.000 --> 00:49:22.000
Which I love, so I'll put Aquarium, and if I want an AI answer from Kagi, I'll put a question mark at the end. If I don't, I don't get it.

00:49:22.000 --> 00:49:28.000
And what makes Kagi interesting is that you can choose which model you want to use.

00:49:28.000 --> 00:49:33.000
So, here, I went into the Coggi settings for the Assistant, that's what they call their AI tool.

00:49:33.000 --> 00:49:39.000
And by default, it's gonna use what Coggi calls the Quick Assistant, the Quick

00:49:39.000 --> 00:49:42.000
model that's built in, and I actually have quite good luck with that.

00:49:42.000 --> 00:49:52.000
But, if you start scrolling down there, you can use Quinn. You can use OpenAI, you can use DeepSeek, you can use Google, you can use Grok if you want, and Mistral.

00:49:52.000 --> 00:49:59.000
And so on. So, you can choose different models to be used, and it remembers the last one you used.

00:49:59.000 --> 00:50:05.000
So how do you use it? Okay, so earlier today, I went to Kagi, and I typed in, what is Slug?

00:50:05.000 --> 00:50:12.000
And it came back. Salute can refer to a few different things. Notice, we're number one. St. Louis Unix Users Group.

00:50:12.000 --> 00:50:17.000
Awesome! Uh, and then it gave some others. Now, if you scroll down a little bit…

00:50:17.000 --> 00:50:23.000
Okay? It says you can continue an assistant. In other words, you can ask questions back…

00:50:23.000 --> 00:50:28.000
about the results. And I actually don't use this all the time, but I have used it, and it's incredibly useful.

00:50:28.000 --> 00:50:35.000
So, it gives you a couple of sample questions. So, I clicked on, what is the St. Louis Unix Users Group?

00:50:35.000 --> 00:50:38.000
Or I could have typed in a follow-up question there.

00:50:38.000 --> 00:50:44.000
And it came back, and after a few seconds, it gave me a much more complete description

00:50:44.000 --> 00:50:48.000
of the St. Louis Unix Group, and seems quite accurate to me.

00:50:48.000 --> 00:50:53.000
And if you scroll down, it gives you references, but on the very bottom there,

00:50:53.000 --> 00:51:02.000
Where I circled quick, that's where you can change the model. So you can either do it in settings, or you can change it after a query.

00:51:02.000 --> 00:51:07.000
And so, when I click on it, again, the one it's currently using is the QUIC model built in to Kaggy.

00:51:07.000 --> 00:51:13.000
Over on the side, they tell you quality speed, privacy, when it was last updated.

00:51:13.000 --> 00:51:15.000
But, if you click on Models…

00:51:15.000 --> 00:51:21.000
It shows you the ones they recommend, but you can scroll down and just start looking at any other ones you want.

00:51:21.000 --> 00:51:24.000
So, like, you know what? We recommend Kimmy.

00:51:24.000 --> 00:51:31.000
It has this quality, this speed, this privacy. Some of these you may have to subscribe to, like OpenAI, the more advanced models.

00:51:31.000 --> 00:51:40.000
Or, uh, if you click on Web Search, you don't… you can say, look, I'm actually not looking for stuff about the web, could you just look on forums?

00:51:40.000 --> 00:51:42.000
Could you just look at academic sources?

00:51:42.000 --> 00:51:45.000
Oh, can you just look at programming sources?

00:51:45.000 --> 00:51:51.000
And so on. So, again, the way Kaggy does it is so intelligent and gives you so much more freedom,

00:51:51.000 --> 00:51:53.000
I wanted to bring that up to everybody.

00:51:53.000 --> 00:51:57.000
Okay, we also have a difference between free and paid.

00:51:57.000 --> 00:51:59.000
So, paid is better than free,

00:51:59.000 --> 00:52:02.000
If you want real usage and fewer limitations.

00:52:02.000 --> 00:52:07.000
So, I mean, within a couple of hours of me using ChatGPT the first time,

00:52:07.000 --> 00:52:12.000
I immediately subscribed, because I could see this is something I want to use a lot.

00:52:12.000 --> 00:52:23.000
Uh, I don't want as many limitations and restrictions. I've been very happy about that. As you heard, and you're gonna see in a minute, JANs is paying the more expensive models for Claude because of the amount he uses it.

00:52:23.000 --> 00:52:29.000
So, ChatGPT has a free version. It also includes ads, and I hate ads.

00:52:29.000 --> 00:52:35.000
They introduced a new version a few months ago called Go. It's only $8 a month, and it includes ads.

00:52:35.000 --> 00:52:43.000
So, you get more than 3, but you don't get as much if you use the $20 a month plus plan, which is what I've been using forever.

00:52:43.000 --> 00:52:48.000
It introduces… it includes, rather, the use of the Codex coding tool,

00:52:48.000 --> 00:52:51.000
which I've used extensively. It's ad-free.

00:52:51.000 --> 00:52:57.000
And then for $200 a month, you get the pro plan, that's everything in Plus, plus unlimited usage.

00:52:57.000 --> 00:53:04.000
Now, I'll be honest with you guys, I've only hit the Plus usage limits a couple of times, where it was like, okay, dude, you need to slow down.

00:53:04.000 --> 00:53:07.000
in months and months and years of use.

00:53:07.000 --> 00:53:10.000
So, that Plus plan is pretty generous.

00:53:10.000 --> 00:53:12.000
Um, but if you're using it, you know,

00:53:12.000 --> 00:53:15.000
Every moment of the day, okay, $200 a month.

00:53:15.000 --> 00:53:23.000
Now, they also have an API. All of these have API usage. They are billed per million tokens, all of them do that.

00:53:23.000 --> 00:53:30.000
And you can see the API pricing here. So, $0.40 for every million tokens you put in.

00:53:30.000 --> 00:53:35.000
$1.60 for every million tokens it sends back out, and it goes up from there.
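
[Editor's note: the per-million-token arithmetic is easy to sketch in the shell. The `api_cost` function below is a made-up helper, not a vendor tool; the $0.40 and $1.60 rates are simply the figures quoted in the talk.]

```shell
# Rough API cost estimator using the rates quoted above:
# $0.40 per million tokens in, $1.60 per million tokens out.
# api_cost is a hypothetical helper, not part of any vendor CLI.
api_cost() {
  # awk does the floating-point math; the result is in dollars
  awk -v in_t="$1" -v out_t="$2" \
    'BEGIN { printf "%.4f\n", in_t * 0.40 / 1000000 + out_t * 1.60 / 1000000 }'
}

api_cost 1000000 1000000   # one million tokens each way: prints 2.0000
```

[So a session that reads in 200,000 tokens and writes 50,000 back, `api_cost 200000 50000`, comes to about 16 cents at these rates.]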

00:53:35.000 --> 00:53:38.000
Anthropic Claude has a free tier.

00:53:38.000 --> 00:53:46.000
Then, the Pro tier is either $20 a month or $204 annually. That includes access to Claude Code and Claude Cowork.

00:53:46.000 --> 00:53:49.000
Or, the max plan has two different levels.

00:53:49.000 --> 00:53:56.000
$100 a month gets you 5 times the usage of the pro plan, or $200 a month gets you 20 times the usage.

00:53:56.000 --> 00:54:00.000
And you get everything in Pro, plus priority access, so…

00:54:00.000 --> 00:54:04.000
If there's slowdowns or things like that, you get priority access.

00:54:04.000 --> 00:54:07.000
They have an API also, uh…

00:54:07.000 --> 00:54:12.000
There are the prices for that. Now, Jans said he's paying above $100 a month.

00:54:12.000 --> 00:54:17.000
And, um, he… so he does the max plan at $100 a month,

00:54:17.000 --> 00:54:22.000
But he is spending more because of API expenses.

00:54:22.000 --> 00:54:29.000
And recently, he built this really, really cool website for his dad to manage his lock collection.

00:54:29.000 --> 00:54:31.000
His dad has thousands?

00:54:31.000 --> 00:54:33.000
of locks, Jans? I don't know, hundreds at least.

00:54:33.000 --> 00:54:41.000
Oh, yeah, he has probably about 1100 locks and then total items, many, many more keys and that kind of stuff, so…

00:54:41.000 --> 00:54:50.000
Right, and he's had this collection for decades, and Jans made him, with Claude, a beautiful website tool he can use to manage

00:54:50.000 --> 00:54:53.000
The locks. And if we have time, maybe you could show it to them.

00:54:53.000 --> 00:54:54.000
Sure.

00:54:54.000 --> 00:54:58.000
And one of the features is, is that he has a tool in there that says,

00:54:58.000 --> 00:55:03.000
Go look up information about this lock, and fill in the description part.

00:55:03.000 --> 00:55:09.000
And it works great. He showed it to me, his dad loves it. It's making things very easy for him.

00:55:09.000 --> 00:55:15.000
Uh, Jans demoed it to me, and correct me if I'm wrong, but you said every single time you run that, that's about 50 cents.

00:55:15.000 --> 00:55:22.000
I actually put a price tracker on it, now I can show you, and it varies, but yeah, okay, call it 50 cents.

00:55:22.000 --> 00:55:25.000
Well, that's what you said to me.

00:55:25.000 --> 00:55:29.000
So, that's an example right there of the API use. Now,

00:55:29.000 --> 00:55:32.000
Again, how much time is he saving? You're saving…

00:55:32.000 --> 00:55:35.000
vast amounts of time.

00:55:35.000 --> 00:55:40.000
So, I know 50 cents a query might seem expensive, but compare that to the

00:55:40.000 --> 00:55:48.000
potentially hours you would spend on every single lock, and I've seen the quality of the information it provides, and it's excellent.

00:55:48.000 --> 00:55:50.000
It includes citations and so on.

00:55:50.000 --> 00:55:52.000
Uh, Google Gemini?

00:55:52.000 --> 00:55:56.000
Uh, if you use the free version, your data is being used for training.

00:55:56.000 --> 00:56:02.000
If you pay $20 a month, you can get Gemini 3. If you pay for Gemini AI Ultra,

00:56:02.000 --> 00:56:07.000
It's $125 a quarter, so $500 a year.

00:56:07.000 --> 00:56:11.000
And that gets you video generation credits, maximum capabilities.

00:56:11.000 --> 00:56:13.000
This is the interesting thing.

00:56:13.000 --> 00:56:22.000
They have the cheapest major API available, Flash-Lite, at 10 cents per million tokens in and $0.40 per million tokens out.

00:56:22.000 --> 00:56:28.000
So, that's pretty darn good. Now, it is the Flash-Lite model, so it's not gonna have all the…

00:56:28.000 --> 00:56:32.000
you know, power of the others, but for simple stuff…

00:56:32.000 --> 00:56:34.000
This is ridiculously cheap.

00:56:34.000 --> 00:56:38.000
Okay, so that's the landscape we're talking about.

00:56:38.000 --> 00:56:40.000
Now we're gonna build.

00:56:40.000 --> 00:56:44.000
So, let's talk about that. Let me get this…

00:56:44.000 --> 00:56:47.000
Stupid.

00:56:47.000 --> 00:56:51.000
There we go. So I'm gonna move Keynote out of the way.

00:56:51.000 --> 00:56:54.000
So we can focus on what we're gonna do.

00:56:54.000 --> 00:56:57.000
Okay. So…

00:56:57.000 --> 00:57:02.000
What we're gonna do is we are going to build…

00:57:02.000 --> 00:57:05.000
a world clock carousel.

00:57:05.000 --> 00:57:08.000
And so, Jan spent some time…

00:57:08.000 --> 00:57:11.000
Earlier today, or maybe yesterday, too,

00:57:11.000 --> 00:57:14.000
Uh, coming up with the requirements document.

00:57:14.000 --> 00:57:21.000
And so, what I'm gonna start with is, we're gonna start… let me pull my terminal down here for you guys.

00:57:21.000 --> 00:57:23.000
So here's my terminal.

00:57:23.000 --> 00:57:26.000
We've reduced the size of this, we don't need all of that.

00:57:26.000 --> 00:57:30.000
And so, we… I'm in my folder on my Mac, where we put

00:57:30.000 --> 00:57:33.000
developer projects and things like that.

00:57:33.000 --> 00:57:34.000
And so, yes?

00:57:34.000 --> 00:57:38.000
Scott, Stan had a question about tokens in and tokens out on the API.

00:57:38.000 --> 00:57:40.000
Go ahead.

00:57:40.000 --> 00:57:44.000
Go ahead and answer.

00:57:44.000 --> 00:57:45.000
Go ahead.

00:57:45.000 --> 00:57:47.000
His question is, what does it mean, tokens in and tokens out? I don't know the answer to that.

00:57:47.000 --> 00:57:50.000
Um, well, a token…

00:57:50.000 --> 00:57:53.000
as I've read, varies, but it's kind of like a word.

00:57:53.000 --> 00:57:56.000
is how I've always seen the simple explanation.

00:57:56.000 --> 00:58:00.000
So, the actual meaning of a token is a little different.

00:58:00.000 --> 00:58:03.000
But, tell you what…

00:58:03.000 --> 00:58:05.000
Let's do this.

00:58:05.000 --> 00:58:08.000
I'm going to…

00:58:08.000 --> 00:58:11.000
Um, fire up.

00:58:11.000 --> 00:58:14.000
the app.

00:58:14.000 --> 00:58:16.000
Hey, Scott returns!

00:58:16.000 --> 00:58:18.000
What exactly…

00:58:18.000 --> 00:58:21.000
is a token.

00:58:21.000 --> 00:58:25.000
I should have put Stan wants to know.

00:58:25.000 --> 00:58:28.000
Basic unit of text. Not words, tokens!

00:58:28.000 --> 00:58:34.000
Uh, okay, I was wrong. Uh, breaks text into chunks that are easier for a model to handle mathematically.

00:58:34.000 --> 00:58:37.000
Common short words, there you go, are usually one token.

00:58:37.000 --> 00:58:40.000
Longer or rarer words get split.

00:58:40.000 --> 00:58:45.000
Punctuation and spaces are often their own tokens. Numbers can be tokenized.

00:58:45.000 --> 00:58:51.000
So, a rough rule of thumb: one token equals around three-quarters of a word in English, or about 4 characters.

00:58:51.000 --> 00:58:56.000
So 1,000 tokens is roughly 750 words.
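
[Editor's note: that rule of thumb is simple enough to sketch in the shell. `estimate_tokens` below is a hypothetical helper applying the roughly-4-characters-per-token approximation from the talk; it is not a real tokenizer.]

```shell
# Very rough token estimate: ~4 characters per token in English.
# This is only the rule of thumb from the talk, not a real tokenizer.
estimate_tokens() {
  chars=$(printf '%s' "$1" | wc -c)   # count bytes in the input text
  echo $(( (chars + 3) / 4 ))         # round up to whole tokens
}

estimate_tokens "Hello world"   # 11 characters, so about 3 tokens
```

[By the same approximation, 750 English words, about 3,000 characters, comes out near the 1,000-token figure mentioned above.]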

00:58:56.000 --> 00:58:57.000
Make sense?

00:58:57.000 --> 00:59:01.000
So, Scott, I think… I think he was asking about what's the difference between tokens in and tokens out, not what a token is.

00:59:01.000 --> 00:59:04.000
Oh, I'm sorry, I thought you were asking what a token is.

00:59:04.000 --> 00:59:05.000
Um, tokens in…

00:59:05.000 --> 00:59:07.000
Actually, both work, you know: in and out, and what it is.

00:59:07.000 --> 00:59:10.000
Okay. Well, a token in is what you would…

00:59:10.000 --> 00:59:14.000
And you explained it. You've shown me there; I figured you'd get to that part. So good.

00:59:14.000 --> 00:59:19.000
Yeah. So, a token in would be you typing instructions, you're telling it, look at these files,

00:59:19.000 --> 00:59:25.000
you know, scan this project, those would be tokens in, tokens out would be what it's giving you.

00:59:25.000 --> 00:59:27.000
How it's responding, let's do it.

00:59:27.000 --> 00:59:28.000
Thank you. Okay, so…

00:59:28.000 --> 00:59:30.000
Great. That's a great explanation. Thank you.

00:59:30.000 --> 00:59:36.000
Um, Jans created a requirements document, which I'm going to show you guys.

00:59:36.000 --> 00:59:42.000
And you're gonna see, again, what Jans said earlier in this talk, when he talked about how much time you spend setting up the requirements,

00:59:42.000 --> 00:59:50.000
You can't just say, hey, build this and expect it to magically build it for you. You're gonna see what's involved in that sort of document.

00:59:50.000 --> 00:59:51.000
So, as I… go ahead.

00:59:51.000 --> 00:59:59.000
Craig pointed out that tokens in would also come from its search tool, if it went out and grabbed information from the web and pulled more information in to process your request.

00:59:59.000 --> 01:00:04.000
That would be tokens in, and then, of course, it's going to generate a response, and that'd be tokens out.

01:00:04.000 --> 01:00:11.000
Right. So, uh, we are going to be creating a tool that's going to create a world clock.

01:00:11.000 --> 01:00:14.000
It's gonna show you the time in your current…

01:00:14.000 --> 01:00:20.000
Uh, time zone, but it's also gonna show others throughout the world. You'll be able to move it, swipe it back and forth,

01:00:20.000 --> 01:00:23.000
and see, uh, various times in different places.

01:00:23.000 --> 01:00:26.000
So, the very first thing we do is we create a folder,

01:00:26.000 --> 01:00:29.000
for, uh, our world clock.

01:00:29.000 --> 01:00:31.000
And I'm gonna cd into it. And now, I'm going…

01:00:31.000 --> 01:00:34.000
So, hold on, so that's really important.

01:00:34.000 --> 01:00:35.000
Go ahead.

01:00:35.000 --> 01:00:38.000
So, Claude Code is project-based.

01:00:38.000 --> 01:00:43.000
And it defines the project by what directory it's in.

01:00:43.000 --> 01:00:45.000
And if that directory

01:00:45.000 --> 01:00:52.000
is a Git repo, and if there's a CLAUDE.md file, so we're going to do all those things.
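
[Editor's note: the setup steps described here, a dedicated directory that is a Git repo containing a CLAUDE.md, look roughly like this in the shell. The `worldclock` folder name is just the example from the demo.]

```shell
# Sketch of the project setup described above. Claude Code treats the
# current directory as the project, so we give it a dedicated one.
mkdir -p worldclock
cd worldclock
git init -q          # make it a Git repo, which Claude Code recognizes
touch CLAUDE.md      # project-level instructions Claude Code reads on start
ls -A                # shows .git and CLAUDE.md
```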

01:00:52.000 --> 01:00:55.000
Yep. Okay, let me, um…

01:00:55.000 --> 01:00:57.000
get to that folder…

01:00:57.000 --> 01:01:00.000
And I'll show you guys what I'm dropping in there.

01:01:00.000 --> 01:01:03.000
Okay. So, earlier today,

01:01:03.000 --> 01:01:06.000
Jans made a requirements document.

01:01:06.000 --> 01:01:09.000
And I just dropped it into that folder.

01:01:09.000 --> 01:01:11.000
using the GUI. Alright?

01:01:11.000 --> 01:01:18.000
So, if I do an ls, there's the markdown file, and by the way, Markdown is used by all of these tools extensively.

01:01:18.000 --> 01:01:26.000
For input, output, so if you're going to be doing this, you gotta learn Markdown. Fortunately, Markdown's not that difficult.

01:01:26.000 --> 01:01:32.000
As you guys should know. So, let's take a look at the document. Now, I'm not gonna read you every aspect of this.

01:01:32.000 --> 01:01:36.000
I'd be happy to make the document available to you guys afterward if you want to study it.

01:01:36.000 --> 01:01:41.000
So, Jans, why don't you walk us through this document, since you're the one that created it?

01:01:41.000 --> 01:01:47.000
Right, so I just came up with the idea of doing a little project to do a world clock web app.

01:01:47.000 --> 01:01:53.000
created this markdown document, just, you know, the file.

01:01:53.000 --> 01:01:56.000
And then started typing in it, you know, what I was thinking.

01:01:56.000 --> 01:02:03.000
You know, world clock, I'm thinking I'll do a carousel of 24 world clocks, one for each time zone.

01:02:03.000 --> 01:02:08.000
Um, show an analog clock with a digital clock below it, and label it with…

01:02:08.000 --> 01:02:11.000
A prominent city in that area, use an emoji.

01:02:11.000 --> 01:02:13.000
flag for, you know, I just kind of…

01:02:13.000 --> 01:02:15.000
put all my thoughts together.

01:02:15.000 --> 01:02:20.000
in no super organized fashion.

01:02:20.000 --> 01:02:21.000
Of course not. No, no, no, no.

01:02:21.000 --> 01:02:23.000
And you weren't this detailed, right? Like, you weren't writing out all of these sentences.

01:02:23.000 --> 01:02:24.000
And you didn't make that table.

01:02:24.000 --> 01:02:25.000
No. No.

01:02:25.000 --> 01:02:28.000
Right? Or did you? No.

01:02:28.000 --> 01:02:29.000
Okay, that's what I thought.

01:02:29.000 --> 01:02:30.000
No, I did not. No.

01:02:30.000 --> 01:02:37.000
And so, so I wrote up my stuff, and then, you know, I popped over to Claude, because I'm working on this

01:02:37.000 --> 01:02:40.000
in VS Code, which is what I was using to type in my…

01:02:40.000 --> 01:02:42.000
My markdown.

01:02:42.000 --> 01:02:46.000
When I popped over to Claude, which was in the same directory, I said, hey,

01:02:46.000 --> 01:02:50.000
um, I'm working on these requirements.

01:02:50.000 --> 01:02:56.000
Take a look and clean this up for me. And so that's what it did, is it kind of organized the stuff, and then from there, I would just type a little bit and expand on it,

01:02:56.000 --> 01:02:59.000
Then ask it, um, say, hey, you know…

01:02:59.000 --> 01:03:02.000
Build a table that has all these.

01:03:02.000 --> 01:03:07.000
Nice. Some asshole joined the… joined the meeting and is, uh…

01:03:07.000 --> 01:03:12.000
showing porn, we need to kick that asshole out.

01:03:12.000 --> 01:03:13.000
Is anybody else seeing this?

01:03:13.000 --> 01:03:19.000
It said Brian B, but that's not me doing it.

01:03:19.000 --> 01:03:25.000
I can… well, I could see it, but… Seems to be gone now.

01:03:25.000 --> 01:03:27.000
Okay, cool.

01:03:27.000 --> 01:03:30.000
Alright, go ahead, Jans.

01:03:30.000 --> 01:03:35.000
Okay. So anyway, yeah, it's pretty much just like writing it up yourself.

01:03:35.000 --> 01:03:39.000
But then when you get to the parts that would take you a bunch of time to format and

01:03:39.000 --> 01:03:44.000
word out, you just ask for a little bit of help. If you don't like what it says, you fix it and change it.

01:03:44.000 --> 01:03:47.000
Etc.

01:03:47.000 --> 01:03:48.000
Now,

01:03:48.000 --> 01:03:51.000
Just loop over it a few times, and you have a set of basic requirements.

01:03:51.000 --> 01:03:52.000
Now, I want to…

01:03:52.000 --> 01:03:59.000
Like, and it did, it filled in some things that I didn't want. It said, this would be nice, and I'm like, no, this is what I definitely want.

01:03:59.000 --> 01:04:00.000
etc.

01:04:00.000 --> 01:04:02.000
Now, I want you guys to notice what Jans said.

01:04:02.000 --> 01:04:04.000
He had the terminal open,

01:04:04.000 --> 01:04:05.000
With Claude.

01:04:05.000 --> 01:04:07.000
Hang on a second, Scott, you'll need to reshare.

01:04:07.000 --> 01:04:10.000
Oh, right?

01:04:10.000 --> 01:04:13.000
I wish there was a way to prevent that.

01:04:13.000 --> 01:04:19.000
You can't prevent other people from sharing their screens?

01:04:19.000 --> 01:04:26.000
Apparently not.

01:04:26.000 --> 01:04:30.000
Okay, so everybody sees the screen now? We all good?

01:04:30.000 --> 01:04:33.000
Okay, so, um…

01:04:33.000 --> 01:04:39.000
I want you guys to remember what Jans said. So, he had Claude open in a terminal,

01:04:39.000 --> 01:04:44.000
In the folder where he was, with the requirements doc he was working on,

01:04:44.000 --> 01:04:49.000
He had VS Code open to the same folder, and the file opened.

01:04:49.000 --> 01:04:57.000
And that way, he could see what was happening in VS Code to the file as Claude was changing it.

01:04:57.000 --> 01:05:01.000
But that meant he could also edit the file manually.

01:05:01.000 --> 01:05:04.000
And then tell Claude, hey, by the way, I just updated the file.

01:05:04.000 --> 01:05:12.000
go take a look. And so, it wasn't that he was only in the terminal, letting it run free without involving himself in it.

01:05:12.000 --> 01:05:19.000
He had it also opened in his IDE, VS Code, and was viewing the file and watching what was happening there.

01:05:19.000 --> 01:05:20.000
So I think that's important.

01:05:20.000 --> 01:05:22.000
Of course, if you'd prefer to use Vim or whatever, you can just do it directly in the terminal.

01:05:22.000 --> 01:05:24.000
Of course.

01:05:24.000 --> 01:05:25.000
Right.

01:05:25.000 --> 01:05:28.000
you know, just right away. By the way, uh, Craig asked about, uh…

01:05:28.000 --> 01:05:31.000
CMUX, we should… we should tell Craig about CMUX.

01:05:31.000 --> 01:05:32.000
Go ahead.

01:05:32.000 --> 01:05:36.000
CMUX is a Ghostty, um…

01:05:36.000 --> 01:05:37.000
based…

01:05:37.000 --> 01:05:39.000
implementation, you know, Ghostty's at its base.

01:05:39.000 --> 01:05:40.000
Have you ever heard of Ghostty?

01:05:40.000 --> 01:05:41.000
Yes.

01:05:41.000 --> 01:05:43.000
G-H-O-S-T-T-Y.

01:05:43.000 --> 01:05:44.000
A terminal.

01:05:44.000 --> 01:05:47.000
That's a terminal emulator that I just started using, it's pretty good.

01:05:47.000 --> 01:05:50.000
Yeah, it's really nice. So this is Ghostty with extra stuff.

01:05:50.000 --> 01:05:53.000
And the MUX is just for multiplexing.

01:05:53.000 --> 01:05:56.000
And I think the C is probably for Claude.

01:05:56.000 --> 01:06:01.000
And the whole idea is that you have workspaces, and each workspace can have panes and tabs.

01:06:01.000 --> 01:06:03.000
So I'll have, I'll have.

01:06:03.000 --> 01:06:09.000
CMUX open with, you know, 8 workspaces, and each workspace has multiple tabs and panes in it.

01:06:09.000 --> 01:06:13.000
And it can… and it has an integrated browser, so you can say, open,

01:06:13.000 --> 01:06:17.000
Open this in a tab, you know, open this in a tab, or open this in a pane,

01:06:17.000 --> 01:06:20.000
And it has the same thing for a markdown renderer.

01:06:20.000 --> 01:06:24.000
Scott, do you have the CMUX skill installed?

01:06:24.000 --> 01:06:26.000
I'm… no, I don't think I do.

01:06:26.000 --> 01:06:31.000
Okay, I'll show you how to do that, because then Claude can control CMUX, and you can just say, hey.

01:06:31.000 --> 01:06:32.000
Nice, okay.

01:06:32.000 --> 01:06:38.000
You can say, open this markdown file, and it'll know that when you say to open a markdown file, it opens it up in a tab.

01:06:38.000 --> 01:06:42.000
Nice. So you can see the tabs here, the workspaces.

01:06:42.000 --> 01:06:48.000
Yeah, it might even work. I mean, try telling Claude to open that markdown file to view it.

01:06:48.000 --> 01:06:49.000
And the world clock. Go back.

01:06:49.000 --> 01:06:51.000
Uh, why don't we get to that in a little bit? Let's get to that in a little bit, okay?

01:06:51.000 --> 01:06:52.000
Okay, that's fine.

01:06:52.000 --> 01:07:00.000
Um, but anyway, yes, it's a very nice terminal emulator. You've got your workspaces, and for instance, here, you've got multiple tabs.

01:07:00.000 --> 01:07:06.000
Oh, Craig, one of the super nice things also is that it has system notifications. This is Mac only, by the way.

01:07:06.000 --> 01:07:11.000
It has system notifications so that when Claude is done with something, it'll beep, or show a little…

01:07:11.000 --> 01:07:17.000
Um, toast, and it will, you know, like, flash itself and that kind of stuff, so that if you're running multiple agents, like

01:07:17.000 --> 01:07:22.000
I do a lot, then, you know, it'll say, hey, I'm done. I want to talk to you.

01:07:22.000 --> 01:07:25.000
I need permissions.

01:07:25.000 --> 01:07:32.000
Now, Ghostty is both a program, a terminal program, and a library. CMUX is using the Ghostty library.

01:07:32.000 --> 01:07:40.000
Um, but I want you guys to notice, it's ready to run for macOS. There are packages, or you can build from source for Linux.

01:07:40.000 --> 01:07:46.000
So that's Ghostty. CMUX took the Ghostty library and built a bunch of stuff on top of it.

01:07:46.000 --> 01:07:47.000
But Ghostty's quite nice.

01:07:47.000 --> 01:07:48.000
Yeah.

01:07:48.000 --> 01:07:51.000
Okay, so this is a requirements document.

01:07:51.000 --> 01:07:55.000
that Jans made. Uh, notice he has some layout

01:07:55.000 --> 01:07:58.000
things in here, about what he wants it to look like,

01:07:58.000 --> 01:08:02.000
What is gonna happen? Uh, and then he wants highlighting.

01:08:02.000 --> 01:08:06.000
He gives information about the analog clock face, and what he wants that to look like.

01:08:06.000 --> 01:08:10.000
Day and night clock face theming. So, by default…

01:08:10.000 --> 01:08:16.000
When you create this, if it's nighttime, where that time zone is, it's gonna be a dark clock,

01:08:16.000 --> 01:08:22.000
If it's currently day, where that time zone is, it's going to be a light clock, and that goes into detail about that at the bottom.

01:08:22.000 --> 01:08:28.000
Uh, notice here, he tells you what he wants to display under each clock face.

01:08:28.000 --> 01:08:32.000
Uh, what the, uh, layout and styling is gonna be.

01:08:32.000 --> 01:08:38.000
Notice the fourth bullet is mobile-first, responsive layout, should work as well on phone screens.

01:08:38.000 --> 01:08:43.000
and desktop. He gives some very smart tech constraints.

01:08:43.000 --> 01:08:50.000
This is a single HTML file with inline CSS and JavaScript. There's no build step, don't overcomplicate this.

01:08:50.000 --> 01:08:51.000
Use vanilla JS,

01:08:51.000 --> 01:08:55.000
You could even pull that out, Scott, if you wanted to, you could yank that section out and…

01:08:55.000 --> 01:09:00.000
Talk to… talk to Claude about making recommendations, but this seems simple for tonight.

01:09:00.000 --> 01:09:04.000
Yeah. And then he has stuff in here that's out of scope.

01:09:04.000 --> 01:09:09.000
So, you know, this is stuff you might think about doing, we don't need to worry about it.

01:09:09.000 --> 01:09:11.000
And success criteria.

01:09:11.000 --> 01:09:13.000
Okay, this is what it should look like.

01:09:13.000 --> 01:09:17.000
And this means you did a good job, and the project is working.

01:09:17.000 --> 01:09:22.000
So real basic, you know, requirements overview for building an app, the same stuff.

01:09:22.000 --> 01:09:25.000
You might write up to discuss with the developer.

01:09:25.000 --> 01:09:27.000
and hash out details.

01:09:27.000 --> 01:09:32.000
Right. But keep in mind, Jans did not just write all that out himself,

01:09:32.000 --> 01:09:39.000
He wrote out the basics, and then by interacting with Claude, and you're gonna see why Claude is good at this…

01:09:39.000 --> 01:09:44.000
He was able to generate this requirements document. And in fact, that's what I'm in the middle of doing.

01:09:44.000 --> 01:09:48.000
right here, see? It's waiting for me to answer questions.

01:09:48.000 --> 01:09:52.000
But I gave it a goal, and it's been going through the process like this.

01:09:52.000 --> 01:09:58.000
It's currently asking clarifying questions, and then once that's done, it can keep going.

01:09:58.000 --> 01:09:59.000
Okay. So…

01:09:59.000 --> 01:10:03.000
But you keep using the word generate, Scott, and it definitely did not generate that requirements document.

01:10:03.000 --> 01:10:05.000
No, I didn't say generate, edited.

01:10:05.000 --> 01:10:08.000
Okay, edited the document. Okay, so, we know the…

01:10:08.000 --> 01:10:13.000
I mean, you could say, hey, generate a requirements document for a world clock, and that's it.

01:10:13.000 --> 01:10:17.000
And it would, and it would do it, it would attempt to do that.

01:10:17.000 --> 01:10:18.000
Right. But it may or may not be…

01:10:18.000 --> 01:10:21.000
But that was, that one was not generated.

01:10:21.000 --> 01:10:25.000
Okay, so we are now gonna, uh…

01:10:25.000 --> 01:10:29.000
Get into Claude. Now, there's actually a command, claude.

01:10:29.000 --> 01:10:32.000
Okay? Jans and I made an alias

01:10:32.000 --> 01:10:36.000
for it. He gave me the alias, actually.

01:10:36.000 --> 01:10:42.000
Um, and that is the… that is the alias, so CLD, because by God, we're not going to type any vowels.

01:10:42.000 --> 01:10:50.000
Hell no. And so, uh, when we do CLD, it runs claude --dangerously-skip-permissions,

01:10:50.000 --> 01:10:54.000
and CLDR, C-L-D-R, means resume from where you were.
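
[Editor's note: the alias definitions weren't shown on screen, so the following is a reconstruction of what's described; `--dangerously-skip-permissions` and `--resume` are real Claude Code CLI flags.]

```shell
# Reconstructed aliases (assumed; the exact definitions weren't shown).
# CLD: start Claude Code with permission prompts disabled.
alias cld='claude --dangerously-skip-permissions'
# CLDR: same, but resume the previous session in this project.
alias cldr='claude --dangerously-skip-permissions --resume'
```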

01:10:54.000 --> 01:11:00.000
Now, we're going to talk about dangerously-skip-permissions. Or is now a good time, Jans? Do you want to talk about that?

01:11:00.000 --> 01:11:01.000
Go ahead.

01:11:01.000 --> 01:11:04.000
That's fine. It has a very complex, detailed, granular

01:11:04.000 --> 01:11:09.000
permission structure. Every time it wants to do something, and it has a ton of tools that can

01:11:09.000 --> 01:11:15.000
do all kinds of different things, every time it uses one of those tools, it'll ask you,

01:11:15.000 --> 01:11:17.000
If you want to allow it to do that.

01:11:17.000 --> 01:11:25.000
And you can say yes, no, or always allow this particular usage of this tool. And it's not just whether it uses the tool, it's exactly how it's using the tool.

01:11:25.000 --> 01:11:29.000
And you can say, yes, no, or always allow it to use the tool this way.

01:11:29.000 --> 01:11:34.000
Um, after a certain amount of time, I think most people that use… use these systems,

01:11:34.000 --> 01:11:38.000
and, um, get comfortable with them, will just stop using that entirely.

01:11:38.000 --> 01:11:41.000
People that I know that are using it do this also.

01:11:41.000 --> 01:11:43.000
Because it's like, well, once you have your…

01:11:43.000 --> 01:11:47.000
Scaffolding your infrastructure and your comfort level up.

01:11:47.000 --> 01:11:51.000
Um, it's just too much of a hassle to keep telling it, yes, do this.

01:11:51.000 --> 01:11:59.000
And if you put things in place in your real projects so that you can roll back, like, one of the things that I do,

01:11:59.000 --> 01:12:03.000
is I have it constantly commit stuff and, you know, I have

01:12:03.000 --> 01:12:06.000
staging servers and all this other kind of stuff, so that I can roll back.

01:12:06.000 --> 01:12:09.000
And, you know, fine.
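
[Editor's note: that commit-then-roll-back safety net can be sketched like this. It's a throwaway demo repo with made-up file names, not their actual setup.]

```shell
# Sketch of the rollback workflow: commit a checkpoint before letting the
# agent loose, so a bad run is one hard reset away. Demo in a temp repo.
dir=$(mktemp -d)
cd "$dir"
git init -q
git -c user.email=demo@example.com -c user.name=demo \
  commit -q --allow-empty -m "checkpoint before agent run"
echo "agent edit" > app.txt        # pretend the agent changed something
git add -A
git -c user.email=demo@example.com -c user.name=demo \
  commit -q -m "agent work"
git reset -q --hard HEAD~1         # didn't like it? back to the checkpoint
```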

01:12:09.000 --> 01:12:14.000
Yeah. Yep, uh, it will drive you insane after a while, uh, because…

01:12:14.000 --> 01:12:18.000
Every little thing it does with Git, it'll go, shoot, can I do this?

01:12:18.000 --> 01:12:20.000
Yes, always do this, or no?

01:12:20.000 --> 01:12:24.000
Um, and then the next thing it does with Git, that's slightly different: can I do this?

01:12:24.000 --> 01:12:33.000
And you will go insane. It's like having a toddler around. It will constantly pester you about the same thing over and over and over if you let it.

01:12:33.000 --> 01:12:40.000
And so, eventually, once you're comfortable and you get sick of it, you're just like, you know what, just do it. Don't ask me, just do it.

01:12:40.000 --> 01:12:41.000
Okay, so I'm gonna…

01:12:41.000 --> 01:12:43.000
It's never quite the same thing; it's slight differences on everything.

01:12:43.000 --> 01:12:45.000
Okay. So…

01:12:45.000 --> 01:12:52.000
Notice, the first time you go to a new folder and start Claude, it wants you to say, okay, do you trust

01:12:52.000 --> 01:12:57.000
This project folder, are you cool with this? Yes, I trust it.

01:12:57.000 --> 01:12:59.000
And I am now in Claude Code.

01:12:59.000 --> 01:13:01.000
And this is the interface

01:13:01.000 --> 01:13:03.000
for Claude Code. Okay?

01:13:03.000 --> 01:13:06.000
So, a couple of things about this.

01:13:06.000 --> 01:13:12.000
If you type a slash, this is how you access various commands and aspects

01:13:12.000 --> 01:13:15.000
of Claude Code. If you start typing,

01:13:15.000 --> 01:13:18.000
letters, it filters it down.

01:13:18.000 --> 01:13:21.000
And so, one of the first things you're probably going to want to do…

01:13:21.000 --> 01:13:24.000
is install some plugins.

01:13:24.000 --> 01:13:28.000
So, if I go to plugin there, slash plugin…

01:13:28.000 --> 01:13:34.000
It says, okay, here are plugins, there's 130, as you can see there, available.

01:13:34.000 --> 01:13:37.000
And you can go down, and you can, you can…

01:13:37.000 --> 01:13:40.000
Click, uh, enter for details.

01:13:40.000 --> 01:13:49.000
And see what's there. Okay. Oh, okay, code access. Agent that simplifies and refines code for clarity, consistency, and maintainability.

01:13:49.000 --> 01:13:54.000
Okay, cool. Or, I can say, go back to the plugin list.

01:13:54.000 --> 01:14:00.000
Then, you can go to Installed. These are the ones I have installed.

01:14:00.000 --> 01:14:04.000
Okay? And these are ones we would recommend you install.

01:14:04.000 --> 01:14:06.000
So, Context7.

01:14:06.000 --> 01:14:11.000
Uh, Jans, you know more about these than I do. Uh, Context7…

01:14:11.000 --> 01:14:13.000
As you can see there, looks up documentation.

01:14:13.000 --> 01:14:21.000
So, it knows that, oh, they're coding in Rust, I better go look at the Rust documentation,

01:14:21.000 --> 01:14:23.000
So I know how to use this most effectively.

01:14:23.000 --> 01:14:27.000
And so on. So, I already have that installed.

01:14:27.000 --> 01:14:33.000
Um, and Jans, you said… could you tell them how and when the plugins get invoked?

01:14:33.000 --> 01:14:36.000
the plugins get invoked when they're needed.

01:14:36.000 --> 01:14:40.000
Or when you ask for them using natural language, you know.

01:14:40.000 --> 01:14:46.000
So if you say a synonym or something that it thinks that this plugin would be appropriate for,

01:14:46.000 --> 01:14:53.000
It'll use it, and it'll tell you which plugins it's using, and if it doesn't use a plugin, then you can tell it specifically: no, no, look this up in Context7.

01:14:53.000 --> 01:14:57.000
or use your skill creator skill to do this.

01:14:57.000 --> 01:15:02.000
You know, but you should just see them get invoked in appropriate ways.

01:15:02.000 --> 01:15:06.000
If it invokes a skill inappropriately, which I don't think I've ever seen,

01:15:06.000 --> 01:15:10.000
I think it's always the other way around, that it skips to use… it forgets to use a skill,

01:15:10.000 --> 01:15:13.000
or doesn't use it, then, um…

01:15:13.000 --> 01:15:14.000
You can tell. Yeah.

01:15:14.000 --> 01:15:17.000
You could stop it. You could escape it anytime and tell it to stop.

01:15:17.000 --> 01:15:27.000
Yep. Uh, Playwright is one if you're doing web development. It actually opens a browser up, and it can control the browser. It uses it for testing or for demonstrations.

01:15:27.000 --> 01:15:28.000
That's really nice to have.

01:15:28.000 --> 01:15:30.000
There's also Puppeteer.

01:15:30.000 --> 01:15:38.000
Now, let's talk about Superpowers, because this is one that's highly recommended, I think, by Jans and many, many, many, many others.

01:15:38.000 --> 01:15:39.000
Um, go ahead and talk about that for a sec.

01:15:39.000 --> 01:15:45.000
Yeah, so Superpowers is a big set of skills.

01:15:45.000 --> 01:15:51.000
that essentially establishes methodology for doing lots of common coding tasks and workflows.

01:15:51.000 --> 01:15:59.000
Um, it used to have a lot more in it that has gotten integrated directly into Claude. Superpowers was a.

01:15:59.000 --> 01:16:02.000
Official Claude.

01:16:02.000 --> 01:16:04.000
plug-in, um…

01:16:04.000 --> 01:16:08.000
But other people were wanting to contribute to it so much, they spun it off.

01:16:08.000 --> 01:16:15.000
And and now they have a wider group that's contributing to it, but Claude will then roll stuff in.

01:16:15.000 --> 01:16:18.000
So, for instance, initially sub-agents were all handled

01:16:18.000 --> 01:16:23.000
and invoked by Superpowers, and now that's built right into…

01:16:23.000 --> 01:16:29.000
into Claude. I'm sorry, I should say Claude Code. Every time I say Claude, I really should be saying Claude Code.

01:16:29.000 --> 01:16:33.000
Um, but that's what it is. So superpowers is a whole bunch of different.

01:16:33.000 --> 01:16:38.000
Markdown skills. All a skill is, is guidance on how to do something.

01:16:38.000 --> 01:16:41.000
It's a set of instructions in normal.

01:16:41.000 --> 01:16:48.000
language written in Markdown that says, these are the things that you need to do when you do this. And so, like I said,

01:16:48.000 --> 01:16:52.000
It's methodology. Skills have become very much methodology. Hey,

01:16:52.000 --> 01:16:58.000
For brainstorming, you do this. You read this, you ask these questions, you follow up with that, you take notes, whatever,

01:16:58.000 --> 01:17:01.000
And then it moves on. So writing plans, writing skills…

01:17:01.000 --> 01:17:06.000
Sorry, when they say writing skills, we're talking about writing Claude skills.

01:17:06.000 --> 01:17:10.000
executing plans, etc. All of those things.

01:17:10.000 --> 01:17:14.000
Yep. So, it's one you definitely want, because it does so much.
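
A skill, as described above, is just a Markdown file of instructions. A minimal sketch of what such a file might look like (the SKILL.md name and YAML front matter follow Anthropic's Agent Skills convention; the rules themselves are invented for illustration, not the speakers' actual skill):

```markdown
---
name: writing-html-and-css
description: Style conventions to follow whenever writing HTML or CSS
---

# Writing HTML and CSS

<!-- Hypothetical rules, for illustration only -->
- Use semantic elements (header, main, article) rather than bare divs where possible.
- Keep CSS in external stylesheets; no inline style attributes.
- Define colors as CSS custom properties so themes live in one place.
```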

01:17:14.000 --> 01:17:20.000
And when I say built into Claude… one of the big things about Claude Code is not the model, it's the…

01:17:20.000 --> 01:17:23.000
They call the harness. It's all this… all this stuff.

01:17:23.000 --> 01:17:26.000
that Claude Code does. If you ever want to check it out,

01:17:26.000 --> 01:17:32.000
They have really good documentation. Just search "how Claude Code works," and they walk you through all the little pieces, hooks,

01:17:32.000 --> 01:17:37.000
and tools, and how all of these things are invoked, and how it works.

01:17:37.000 --> 01:17:39.000
It's probably more than we want to go into tonight.

01:17:39.000 --> 01:17:42.000
But good reading, and it's really well written.

01:17:42.000 --> 01:17:43.000
I bet Claude helped him.

01:17:43.000 --> 01:17:51.000
Now, one thing that's very important, though, is you'll notice that Context7 is coming from the Claude Plugins official

01:17:51.000 --> 01:17:53.000
marketplace that's built in.

01:17:53.000 --> 01:17:59.000
Um, there is a Superpowers available in Claude Plugins Official. Do not use it.

01:17:59.000 --> 01:18:02.000
Instead, you want to go to the marketplace,

01:18:02.000 --> 01:18:08.000
And you want to search for, or add, rather, superpowers-marketplace.

01:18:08.000 --> 01:18:18.000
You do not want to use the one built into Claude Plugins Official, because it's old. You want to use the one that's available in the Superpowers Marketplace, so you need to add that marketplace.

01:18:18.000 --> 01:18:22.000
Look at your marketplaces, Scott.

01:18:22.000 --> 01:18:23.000
I mean, these two?

01:18:23.000 --> 01:18:25.000
Um, right, uh…

01:18:25.000 --> 01:18:29.000
It's obra, that's O-B-R-A. That's the thing to look for, yeah.

01:18:29.000 --> 01:18:31.000
So, select Add Marketplace.

01:18:31.000 --> 01:18:34.000
You know, you see it right there, so obra's superpowers-marketplace.

01:18:34.000 --> 01:18:35.000
Oh, obra's, yeah.

01:18:35.000 --> 01:18:37.000
Yeah.

01:18:37.000 --> 01:18:40.000
So, search for obra, not "superpowers marketplace."

01:18:40.000 --> 01:18:41.000
Yeah. Okay.

01:18:41.000 --> 01:18:51.000
Okay. So, you click on Add Marketplace, you type obra, you add that marketplace, and now we can get plugins from two sources: the Claude Plugins Official,

01:18:51.000 --> 01:18:54.000
or the Superpowers Marketplace.

01:18:54.000 --> 01:18:57.000
So, we definitely recommend that.
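
The click path just described maps to Claude Code's plugin slash commands; per the Superpowers project's own install instructions, adding the marketplace and the plugin looks roughly like this (verify against the current README):

```
/plugin marketplace add obra/superpowers-marketplace
/plugin install superpowers@superpowers-marketplace
```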

01:18:57.000 --> 01:18:59.000
Hold on, I'm trying to get this…

01:18:59.000 --> 01:19:04.000
stupid, uh, zoom bar. Okay, so that's plugins.

01:19:04.000 --> 01:19:10.000
Jans, any others we should talk about real quick?

01:19:10.000 --> 01:19:11.000
Yeah.

01:19:11.000 --> 01:19:18.000
Plugins? No, I would say just keep your plugins pretty lean. Don't put a bunch of plugins thinking, oh boy, this is going to be really, really powerful.

01:19:18.000 --> 01:19:21.000
The more plugins you add, the more context you eat up.

01:19:21.000 --> 01:19:28.000
More is not necessarily better. You want it to be appropriate to what you're doing.

01:19:28.000 --> 01:19:30.000
That's right.

01:19:30.000 --> 01:19:36.000
Now, let's talk about config for a minute, because you click on that, and these are configurations.

01:19:36.000 --> 01:19:41.000
that you can make for, um, how Claude acts, what it does.

01:19:41.000 --> 01:19:45.000
So, for instance, Auto Compact, periodically, it'll go…

01:19:45.000 --> 01:19:52.000
We're getting a lot of conversation here, let's compact that to reduce it, and it'll do that for you. So you want that on.

01:19:52.000 --> 01:19:53.000
One that…

01:19:53.000 --> 01:19:58.000
It has to do that once it runs out of context for your current conversation and task.

01:19:58.000 --> 01:20:05.000
But one Jans pointed out today that we might want to do tonight is, for output style, you have

01:20:05.000 --> 01:20:06.000
Oops, I hit the wrong…

01:20:06.000 --> 01:20:10.000
You need the spacebar, Scott. But before you do that, you should run /init.

01:20:10.000 --> 01:20:14.000
Okay. So, we're in the folder.

01:20:14.000 --> 01:20:18.000
I'm gonna run /init, and that's gonna initialize a new CLAUDE.md file.

01:20:18.000 --> 01:20:24.000
CLAUDE.md is the Markdown file that Claude looks for to understand how it's going to do things.

01:20:24.000 --> 01:20:27.000
So, you click it…

01:20:27.000 --> 01:20:31.000
It's very funny, by the way, notice it comes up with quite funny terms.

01:20:31.000 --> 01:20:36.000
julienning, and so on.

01:20:36.000 --> 01:20:50.000
So, you can read what it… it tells you what it's doing.

01:20:50.000 --> 01:20:53.000
Okay, it created a 58-line…

01:20:53.000 --> 01:20:55.000
CLAUDE.md file.

01:20:55.000 --> 01:20:59.000
Okay? And there you go.

01:20:59.000 --> 01:21:00.000
Now, I'll show you the document.

01:21:00.000 --> 01:21:03.000
See what it says up there, Scott? I mean, read from the top.

01:21:03.000 --> 01:21:04.000
The project only has a requirements file, no code's been written yet, the CLAUDE.md

01:21:04.000 --> 01:21:08.000
That's kind of…

01:21:08.000 --> 01:21:11.000
should document what's being built and key constraints.

01:21:11.000 --> 01:21:13.000
It wrote 58 lines

01:21:13.000 --> 01:21:18.000
of CLAUDE.md, and then it describes down below there: since the project only has requirements and no code,

01:21:18.000 --> 01:21:23.000
the file documents the single-file constraint, why it matters, how to run it,

01:21:23.000 --> 01:21:30.000
The intended architecture, so future sessions can build it consistently, and the full clock data table for quick reference.

01:21:30.000 --> 01:21:33.000
Right, and so that's just feedback for you.

01:21:33.000 --> 01:21:41.000
So the only thing it did is write the CLAUDE.md file. Now, I would also, anytime you're doing a Claude project, just make sure that it gets set up, so you can just tell it, you know,

01:21:41.000 --> 01:21:43.000
Initialize this project as a Git repo.

01:21:43.000 --> 01:21:46.000
you know, just tell it, you know, make this a git repo.

01:21:46.000 --> 01:21:47.000
Want me to go ahead and do that?

01:21:47.000 --> 01:21:59.000
Sure.

01:21:59.000 --> 01:22:04.000
Done. So, just…

01:22:04.000 --> 01:22:05.000
Of course.

01:22:05.000 --> 01:22:07.000
Right, and so, I mean, you can do that yourself, you know, you could do that at command line, but, you know, and you see what it did also.

01:22:07.000 --> 01:22:13.000
Now let's go ahead and turn on that explanatory mode.

01:22:13.000 --> 01:22:18.000
Alright. And that was… uh, /config.

01:22:18.000 --> 01:22:28.000
And let's go down here to Output Style, and hit Spacebar. So, these are the three ways Claude will communicate with you. By default, it uses the default one.

01:22:28.000 --> 01:22:33.000
Jans was suggesting we use the explanatory communication style.

01:22:33.000 --> 01:22:35.000
Why'd you choose that, Jans, for this?

01:22:35.000 --> 01:22:39.000
Well, because it explains what's going on.

01:22:39.000 --> 01:22:40.000
Okay, perfect for what we're doing tonight.

01:22:40.000 --> 01:22:42.000
I mean, there you go.

01:22:42.000 --> 01:22:50.000
You know, you would also use this if you're trying to learn stuff. Learning takes it one step further, where it will flag stuff for human coding.

01:22:50.000 --> 01:22:55.000
And it'll do stuff and say, hey, human, you do this part.

01:22:55.000 --> 01:23:00.000
Yep, ask you to write small pieces of code for hands-on practice, which is pretty cool.

01:23:00.000 --> 01:23:03.000
Alright, that's config.

01:23:03.000 --> 01:23:07.000
We are in the project. I did want to show you guys…

01:23:07.000 --> 01:23:08.000
Um…

01:23:08.000 --> 01:23:10.000
Robert asked about how do you install Claude?

01:23:10.000 --> 01:23:11.000
Ah! Good question.

01:23:11.000 --> 01:23:14.000
We had that in the plan, I don't know what, what happened to that, Scotty.

01:23:14.000 --> 01:23:18.000
I didn't have time to put it in here, because we had little time.

01:23:18.000 --> 01:23:22.000
Uh, go to claude.ai.

01:23:22.000 --> 01:23:27.000
Okay? And go to Claude Code, is that correct, Jans?

01:23:27.000 --> 01:23:33.000
I don't know, I just… I searched Kagi for install Claude and get the… it's just a curl command.

01:23:33.000 --> 01:23:34.000
Yeah.

01:23:34.000 --> 01:23:36.000
It just runs a script, it's just a bash script.

01:23:36.000 --> 01:23:37.000
to install it.

01:23:37.000 --> 01:23:46.000
Yup. It should give it to us right here, I believe.

01:23:46.000 --> 01:23:49.000
It's taken a long time.

01:23:49.000 --> 01:23:51.000
Okay, right there.

01:23:51.000 --> 01:23:52.000
That's all you do. Copy that, run it.

01:23:52.000 --> 01:23:54.000
There you go.

01:23:54.000 --> 01:23:58.000
And it goes, and now you have Claude Code available.
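
For reference, the installer being described is a one-line curl-to-shell pipe. At the time of writing it looks like the following; check the official Claude Code docs for the current command, and read any script before piping it into your shell:

```
curl -fsSL https://claude.ai/install.sh | bash
```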

01:23:58.000 --> 01:24:01.000
Good question. Okay.

01:24:01.000 --> 01:24:03.000
So, um…

01:24:03.000 --> 01:24:05.000
Let me…

01:24:05.000 --> 01:24:07.000
Uh, I want to show you guys

01:24:07.000 --> 01:24:10.000
that…

01:24:10.000 --> 01:24:13.000
Um, what that document it made looked like.

01:24:13.000 --> 01:24:16.000
So, let's look at CLAUDE.md.

01:24:16.000 --> 01:24:17.000
Indeed.

01:24:17.000 --> 01:24:19.000
Why did you go to a different workspace, Scott? That's weird.

01:24:19.000 --> 01:24:21.000
You want me to do it right from in there?

01:24:21.000 --> 01:24:23.000
That's what, that's what CMUX is for.

01:24:23.000 --> 01:24:25.000
So, less…

01:24:25.000 --> 01:24:26.000
Oh, tab it. New tab.

01:24:26.000 --> 01:24:29.000
No. Yeah, just open up open a few tabs.

01:24:29.000 --> 01:24:33.000
Right, okay, whatever.

01:24:33.000 --> 01:24:35.000
That's the tool, man.

01:24:35.000 --> 01:24:43.000
So, there's the CLAUDE.md file, it's added.

01:24:43.000 --> 01:24:48.000
Why is it doing that?

01:24:48.000 --> 01:24:52.000
Why is it doing that?

01:24:52.000 --> 01:24:54.000
Okay, I have no idea.

01:24:54.000 --> 01:24:57.000
Why, when I type less, it's doing that.

01:24:57.000 --> 01:24:59.000
Okay, I'll cat it, what the hell?

01:24:59.000 --> 01:25:00.000
There you go.

01:25:00.000 --> 01:25:02.000
Oh, you have some…

01:25:02.000 --> 01:25:05.000
filter that's trying to, um…

01:25:05.000 --> 01:25:07.000
render the markdown nicer than just text.

01:25:07.000 --> 01:25:09.000
Oh, got it.

01:25:09.000 --> 01:25:13.000
Um, but you're probably missing the…

01:25:13.000 --> 01:25:25.000
Yep. Okay, so this is the CLAUDE.md file that it just generated. You just saw… you just saw it generate. There's the description of the project. Here are the tech constraints.

01:25:25.000 --> 01:25:26.000
Okay.

01:25:26.000 --> 01:25:27.000
It looks like it mostly just copied the markdown file right in.

01:25:27.000 --> 01:25:28.000
Yeah. Yeah.

01:25:28.000 --> 01:25:29.000
Which is fine for now.

01:25:29.000 --> 01:25:32.000
It looks pretty… yeah.

01:25:32.000 --> 01:25:33.000
Okay. So, back to where we are.

01:25:33.000 --> 01:25:45.000
So, CLAUDE.md… really what it's for is that when you start Claude in this directory, it reads that into context, so it knows what's going on in this project without you having to tell it every time.

01:25:45.000 --> 01:25:48.000
Um, and so what it has in there right now is fine.
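
To make that concrete, a CLAUDE.md for a project like this one might look roughly like this (contents invented for illustration; the file /init actually generated here was 58 lines):

```markdown
# World Clock Carousel

Single-page web app showing 24 world clocks in a carousel.

## Constraints
- Plain HTML/CSS/JS; no build step, no server.
- Must work opened as a local file in the browser.

## Conventions
- Follow the writing-HTML-and-CSS skill for style rules.
```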

01:25:48.000 --> 01:25:50.000
Okay? Okay.

01:25:50.000 --> 01:25:54.000
We're ready to actually start, and so we wrote this out ahead of time.

01:25:54.000 --> 01:25:58.000
So, help me plan and brainstorm this single-page web app.

01:25:58.000 --> 01:26:04.000
See, uh, world clock carousel for the initial requirements. I'd better put an .md at the end of that, or is that enough?

01:26:04.000 --> 01:26:08.000
No, I… you probably shouldn't have pasted that in. But anyway, go ahead.

01:26:08.000 --> 01:26:10.000
Okay. Because you want me to type it out?

01:26:10.000 --> 01:26:14.000
Well, yeah, when you start typing the path, like I said, it'll autocomplete.

01:26:14.000 --> 01:26:18.000
Right. Let me get rid of that.

01:26:18.000 --> 01:26:20.000
Okay…

01:26:20.000 --> 01:26:28.000
Okay. So, initial requirements… I should have hit Tab here.

01:26:28.000 --> 01:26:31.000
Nevermind.

01:26:31.000 --> 01:26:35.000
Go back, I think that'll do it, and there you go.

01:26:35.000 --> 01:26:39.000
There you go. And, uh, I'll leave it at the end, whatever, it shouldn't matter.

01:26:39.000 --> 01:26:46.000
Okay, so there's the file for the initial requirements. Don't forget to use the writing-HTML-and-CSS custom skill.

01:26:46.000 --> 01:26:50.000
Talking about skills, this is one I believe you created, is that correct?

01:26:50.000 --> 01:26:53.000
Yeah, I did it, it's just a style skill.

01:26:53.000 --> 01:26:57.000
Just saying, hey, this is how I prefer you to write HTML and CSS.

01:26:57.000 --> 01:27:02.000
Okay. Go.

01:27:02.000 --> 01:27:08.000
Okay, notice it loaded the Superpowers brainstorming skill, because we said we want to brainstorm.

01:27:08.000 --> 01:27:14.000
And notice it loaded the writing-HTML-and-CSS skill, because we mentioned that.

01:27:14.000 --> 01:27:19.000
Thinking, thinking, thinking…

01:27:19.000 --> 01:27:24.000
And you didn't have to put that writing-HTML-and-CSS skill in this initial prompt, either.

01:27:24.000 --> 01:27:29.000
You can just tell it to do it, and at some point in time, say, you know, I don't see that you mentioned this in the…

01:27:29.000 --> 01:27:31.000
In the.

01:27:31.000 --> 01:27:34.000
Plans, you know, be sure to include that.

01:27:34.000 --> 01:27:37.000
Okay? So, notice what it says.

01:27:37.000 --> 01:27:41.000
Some of what we're working on might be easier to explain if I can show it to you in a web browser.

01:27:41.000 --> 01:27:44.000
Mockups, diagrams, comparisons, other visuals,

01:27:44.000 --> 01:27:47.000
Uh, want to try it? And I'm gonna say…

01:27:47.000 --> 01:27:49.000
Yes.

01:27:49.000 --> 01:27:50.000
Okay.

01:27:50.000 --> 01:27:55.000
So, it's gonna open my browser.

01:27:55.000 --> 01:27:57.000
Well, one of my browsers, somewhere.

01:27:57.000 --> 01:28:01.000
We'll see what it does.

01:28:01.000 --> 01:28:12.000
We'll see.

01:28:12.000 --> 01:28:18.000
Okay, it says it's completed the first two steps.

01:28:18.000 --> 01:28:20.000
Note, it's now moved on.

01:28:20.000 --> 01:28:24.000
And notice it's telling you how many tokens it's using? Do you see that?

01:28:24.000 --> 01:28:29.000
So it tells you how long it's been working, and how many tokens it's used, and what, you know, it's thinking.

01:28:29.000 --> 01:28:42.000
We're currently at step 3, so it's looking through everything and thinking, hmm, what questions should I ask to understand this better?

01:28:42.000 --> 01:28:47.000
Well, that was nice of it to add a file to my gitignore for the project.

01:28:47.000 --> 01:28:50.000
Okay, open local host…

01:28:50.000 --> 01:28:53.000
In your, in your browser…

01:28:53.000 --> 01:28:59.000
Let's do that…

01:28:59.000 --> 01:29:03.000
Put that open one moment…

01:29:03.000 --> 01:29:06.000
I've never seen this before, by the way.

01:29:06.000 --> 01:29:08.000
There you go.

01:29:08.000 --> 01:29:11.000
That's what I got a little while ago in the other one.

01:29:11.000 --> 01:29:13.000
Boom!

01:29:13.000 --> 01:29:16.000
Alright, so let's go back to this.

01:29:16.000 --> 01:29:19.000
So, question 1 of 4.

01:29:19.000 --> 01:29:25.000
You'll see 3 color palette accent options, each showing a mini clock card with the accent color applied to the second hand and the

01:29:25.000 --> 01:29:27.000
Home Clock Top Border Treatment.

01:29:27.000 --> 01:29:29.000
Uh, do you want to go with Cool Blue?

01:29:29.000 --> 01:29:35.000
warm amber, or slate and teal, click one to select.

01:29:35.000 --> 01:29:38.000
Alright, so let's go here.

01:29:38.000 --> 01:29:44.000
Which one do you guys want? Do you guys want cool blue, warm amber, or slate and teal?

01:29:44.000 --> 01:29:47.000
By the way, you don't have to choose any of those, Scott, you could just…

01:29:47.000 --> 01:29:48.000
Of course.

01:29:48.000 --> 01:29:56.000
prompt right back in there, I'm like, no, no, I don't want any of these, I'm thinking much more of a, you know, whatever.

01:29:56.000 --> 01:29:57.000
Well, because your monitor's not very good, I guess.

01:29:57.000 --> 01:29:59.000
Why do they all look the same?

01:29:59.000 --> 01:30:00.000
Because Zoom is compressing it.

01:30:00.000 --> 01:30:02.000
You can't.

01:30:02.000 --> 01:30:04.000
If you look right here, right here…

01:30:04.000 --> 01:30:08.000
The hands are different colors.

01:30:08.000 --> 01:30:10.000
There's also a different… different shade in the background.

01:30:10.000 --> 01:30:11.000
Yeah, different background shades, pretty subtle.

01:30:11.000 --> 01:30:12.000
It's pretty.

01:30:12.000 --> 01:30:13.000
But not on the other side.

01:30:13.000 --> 01:30:20.000
Do you want me to tell it? Let's make it more obvious?

01:30:20.000 --> 01:30:35.000
Okay.

01:30:35.000 --> 01:30:43.000
So that'll be interesting, to see what it does from that prompt. But I would have said, make the color differences more obvious, but…

01:30:43.000 --> 01:30:44.000
Okay.

01:30:44.000 --> 01:30:52.000
It might make them bigger.

01:30:52.000 --> 01:30:55.000
Still thinking.

01:30:55.000 --> 01:30:59.000
Oh, just updated some code.

01:30:59.000 --> 01:31:00.000
That's more obvious.

01:31:00.000 --> 01:31:05.000
Yeah, same colors. No, no, it did not change the color differences, but it certainly made them bigger.

01:31:05.000 --> 01:31:11.000
You guys want me to do it again and say, increase the color differences?

01:31:11.000 --> 01:31:13.000
Well, the second hands are definitely different colors now.

01:31:13.000 --> 01:31:16.000
Okay, if you guys are happy with that…

01:31:16.000 --> 01:31:22.000
Which one do we want? I'm gonna go with warm amber, unless somebody says,

01:31:22.000 --> 01:31:25.000
Warm amber, okay?

01:31:25.000 --> 01:31:28.000
Okay, that's what we're doing. So notice it said click on it.

01:31:28.000 --> 01:31:31.000
So I clicked on it, let's see what it did.

01:31:31.000 --> 01:31:33.000
Um…

01:31:33.000 --> 01:31:38.000
Okay, it told me what it did. So, clicking on it didn't do a thing, except show me what it looks like.

01:31:38.000 --> 01:31:50.000
So I'm gonna go back here and say, warm amber.

01:31:50.000 --> 01:31:52.000
Okay? Great choice!

01:31:52.000 --> 01:31:57.000
It approves of what I'm doing.

01:31:57.000 --> 01:31:58.000
Next question. The requirements list several options for how to visually distinguish the home clock. Let me show you the realistic options side by side.

01:31:58.000 --> 01:32:05.000
The realistic options…

01:32:05.000 --> 01:32:08.000
Let's see what we get…

01:32:08.000 --> 01:32:09.000
Yep. Not the unrealistic options.

01:32:09.000 --> 01:32:16.000
I wanna see unrealistic options.

01:32:16.000 --> 01:32:19.000
Realistic options.

01:32:19.000 --> 01:32:21.000
Well, it might have used negative colors.

01:32:21.000 --> 01:32:23.000
that don't exist.

01:32:23.000 --> 01:32:28.000
It has certain words it loves to use, like the word quality.

01:32:28.000 --> 01:32:33.000
This has this, it has this quality.

01:32:33.000 --> 01:32:40.000
Okay. So, question 2 of 4, 3 home clock highlight treatments, each shown with neighboring cards for context.

01:32:40.000 --> 01:32:44.000
Amber top border and shadow, amber background tint plus home icon.

01:32:44.000 --> 01:32:49.000
Larger scale and amber left border.

01:32:49.000 --> 01:32:53.000
Okay. So, do we want that for our home?

01:32:53.000 --> 01:32:55.000
or that…

01:32:55.000 --> 01:32:58.000
or that.

01:32:58.000 --> 01:33:02.000
What do you guys like? So here we got the border on top.

01:33:02.000 --> 01:33:05.000
Right? The, uh…

01:33:05.000 --> 01:33:12.000
the word home, along with New York. Here they put the home icon by New York, I don't like that as much.

01:33:12.000 --> 01:33:16.000
Here, they put the amber border on the side.

01:33:16.000 --> 01:33:21.000
20% bigger than its neighbors.

01:33:21.000 --> 01:33:23.000
You like that first one?

01:33:23.000 --> 01:33:27.000
Okay? But is that okay with everybody?

01:33:27.000 --> 01:33:35.000
Okay? Let's go with A.

01:33:35.000 --> 01:33:36.000
Jans?

01:33:36.000 --> 01:33:40.000
You know, Scott, you could tell it. Yeah, I clicked.

01:33:40.000 --> 01:33:44.000
Oh yeah, that's a good point. Yeah, I clicked and nothing's happening.

01:33:44.000 --> 01:33:49.000
You don't even have to tell it that. Okay, I clicked.

01:33:49.000 --> 01:33:52.000
But it's kind of in the middle of something now, it's too late now.

01:33:52.000 --> 01:33:56.000
But I can go ahead and send it.

01:33:56.000 --> 01:34:13.000
Sure. Um…

01:34:13.000 --> 01:34:15.000
Okay, one more visual question.

01:34:15.000 --> 01:34:16.000
What's that?

01:34:16.000 --> 01:34:18.000
Is that a word? What happened to the word?

01:34:18.000 --> 01:34:20.000
It is now.

01:34:20.000 --> 01:34:25.000
It is now. It'll know what I mean.

01:34:25.000 --> 01:34:26.000
It'll have no… it has very little problem with misspellings.

01:34:26.000 --> 01:34:28.000
I guess.

01:34:28.000 --> 01:34:29.000
Or typos in this case, because I know how to spell "happened."

01:34:29.000 --> 01:34:30.000
Right. So, by the way, doing something like that can interrupt its work.

01:34:30.000 --> 01:34:37.000
Yeah, I've noticed it in chat.

01:34:37.000 --> 01:34:42.000
If you do that while it's in the middle of doing some stuff and it can, it can, uh.

01:34:42.000 --> 01:34:47.000
slow things down for you. It usually can recover, but you can do a /btw to

01:34:47.000 --> 01:34:48.000
Right, by the way. That's what I should have done.

01:34:48.000 --> 01:34:54.000
To say, you know, put something in there and then it will not interrupt itself.

01:34:54.000 --> 01:34:59.000
So, the click just highlights your selection in the browser as a visual record. You still tell me here in the terminal.

01:34:59.000 --> 01:35:00.000
Okay.

01:35:00.000 --> 01:35:05.000
Sorry, that wasn't clear! Okay, question 3 of 4. Day versus night clock.

01:35:05.000 --> 01:35:11.000
faces, side by side, for 3 darkness levels: soft navy, deep midnight, slate charcoal.

01:35:11.000 --> 01:35:20.000
Which nighttime face do you prefer?

01:35:20.000 --> 01:35:21.000
If you tell it to…

01:35:21.000 --> 01:35:22.000
also deal with colorblind-friendly colors…

01:35:22.000 --> 01:35:24.000
if that's something that you need.

01:35:24.000 --> 01:35:25.000
Okay.

01:35:25.000 --> 01:35:29.000
Better yet, you should have a skill that tells it that, without you having to tell it every time.

01:35:29.000 --> 01:35:30.000
Yep, absolutely.

01:35:30.000 --> 01:35:33.000
Yeah, right? Oh, yeah.

01:35:33.000 --> 01:35:38.000
Yeah. Okay, what do people think?

01:35:38.000 --> 01:35:42.000
I don't see a huge bit of difference, to be honest.

01:35:42.000 --> 01:35:46.000
Anybody? Got a choice?

01:35:46.000 --> 01:35:50.000
Okay, we'll go with 2.

01:35:50.000 --> 01:35:53.000
Okay? Uh, go ahead, Kevin.

01:35:53.000 --> 01:35:58.000
When did it decide to add the numbers to the clock face? They weren't there earlier.

01:35:58.000 --> 01:35:59.000
Prove it.

01:35:59.000 --> 01:36:01.000
Here.

01:36:01.000 --> 01:36:03.000
Yeah, I don't have a picture from earlier.

01:36:03.000 --> 01:36:05.000
So…

01:36:05.000 --> 01:36:07.000
Yeah, earlier I noticed there were no numbers on the clock faces.

01:36:07.000 --> 01:36:12.000
Yeah. That didn't matter for what it was doing there.

01:36:12.000 --> 01:36:13.000
Okay.

01:36:13.000 --> 01:36:20.000
Who knows why it didn't have numbers on there? I was interested to see, see what it's going to put on here.

01:36:20.000 --> 01:36:24.000
Okay, we said we prefer deep midnight B.

01:36:24.000 --> 01:36:28.000
That was question 3 of 4.

01:36:28.000 --> 01:36:32.000
Okay…

01:36:32.000 --> 01:36:36.000
What type of Linux am I running? I'm on a Mac.

01:36:36.000 --> 01:36:42.000
Can I get that clock face in, say, cornflower blue? We could!

01:36:42.000 --> 01:36:48.000
Alright, let me push a waiting screen while we move to the terminal for the architecture discussion.

01:36:48.000 --> 01:37:02.000
Design decisions locked in.

01:37:02.000 --> 01:37:04.000
Okay, continuing in terminal.

01:37:04.000 --> 01:37:06.000
So let's go back here and look.

01:37:06.000 --> 01:37:10.000
Key architectural questions. Since everything lives in a single script block,

01:37:10.000 --> 01:37:13.000
How do we structure the clock rendering and updates? Three options.

01:37:13.000 --> 01:37:16.000
Flat procedural plus setInterval?

01:37:16.000 --> 01:37:24.000
A flat array of 24 clock data objects; on page load, loop once to innerHTML all 24 cards; each second, set…

01:37:24.000 --> 01:37:29.000
Uh, set it on the SVG hand element. So, Jans, we were talking about that, what it was gonna pick.

01:37:29.000 --> 01:37:34.000
So it went with SVG, makes sense.

01:37:34.000 --> 01:37:36.000
Um… or…

01:37:36.000 --> 01:37:42.000
Clock objects with requestAnimationFrame.

01:37:42.000 --> 01:37:43.000
Notice how it nicely gives us the pros and the cons?

01:37:43.000 --> 01:37:45.000
Mm-hmm.

01:37:45.000 --> 01:37:51.000
More complex; a sweeping second hand on a world clock feels like over-engineering.

01:37:51.000 --> 01:37:57.000
Okay… CSS custom properties for everything: each SVG element gets CSS custom properties…

01:37:57.000 --> 01:38:01.000
updated by JS; day-night theming via CSS, driven by JS…

01:38:01.000 --> 01:38:04.000
Pros, separation of styling, CSS transitions,

01:38:04.000 --> 01:38:12.000
CSS animation sync to wall clock time is genuinely tricky to implement correctly in a single file.
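
Option A, the flat procedural approach, can be sketched in plain JavaScript. This is a reconstruction of the idea being discussed, not the code Claude actually generated; the function and the hand-update snippet are invented for illustration:

```javascript
// Compute SVG hand rotation angles (degrees) for one clock,
// given a Date and an IANA time zone name.
function handAngles(date, timeZone) {
  const parts = new Intl.DateTimeFormat("en-US", {
    timeZone,
    hour12: false,
    hour: "numeric",
    minute: "numeric",
    second: "numeric",
  }).formatToParts(date);
  const num = (type) => Number(parts.find((p) => p.type === type).value);
  const h = num("hour") % 12; // some engines report midnight as 24
  const m = num("minute");
  const s = num("second");
  return {
    hour: h * 30 + m * 0.5,  // 360° / 12 hours, drifting within the hour
    minute: m * 6 + s * 0.1, // 360° / 60 minutes
    second: s * 6,           // 360° / 60 seconds
  };
}

// In the page, a single setInterval would update every card each second, e.g.:
// setInterval(() => {
//   for (const clock of clocks) {
//     const a = handAngles(new Date(), clock.tz);
//     clock.secondHand.setAttribute("transform", `rotate(${a.second} 50 50)`);
//   }
// }, 1000);
```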

01:38:12.000 --> 01:38:15.000
Now, I want you guys to notice something.

01:38:15.000 --> 01:38:18.000
You have to know what this means, to a degree.

01:38:18.000 --> 01:38:22.000
To be able to use it. Otherwise, you're just kind of going…

01:38:22.000 --> 01:38:24.000
Okay, I'll do what you recommend.

01:38:24.000 --> 01:38:30.000
So, for instance, I've been working with Codex to build this app, and it's currently 14,000 lines of Python.

01:38:30.000 --> 01:38:32.000
I don't know any Python.

01:38:32.000 --> 01:38:36.000
So, I am completely trusting it. The only way I can tell it…

01:38:36.000 --> 01:38:39.000
what's wrong, or interact, is by, did it work?

01:38:39.000 --> 01:38:40.000
Oh, it didn't? I noticed it did this. Fix this. That's it.

01:38:40.000 --> 01:38:42.000
If that's your job, then you would already have that, hopefully.

01:38:42.000 --> 01:38:47.000
But, uh, the more you know, the more you can interact with this,

01:38:47.000 --> 01:38:49.000
And get better results.

01:38:49.000 --> 01:38:51.000
more closely to what you want.

01:38:51.000 --> 01:38:56.000
In fact, sometimes, I know Jan's has done the same thing. I've gone into the CSS code and said,

01:38:56.000 --> 01:39:01.000
No, no, no, let me edit the code, do it this way, because I didn't like what it was doing.

01:39:01.000 --> 01:39:03.000
So, A sounds good to me!

01:39:03.000 --> 01:39:06.000
Does A sound good to everybody?

01:39:06.000 --> 01:39:07.000
Okay.

01:39:07.000 --> 01:39:15.000
At this point in time, if I was doing this for real, I'd say, well, yeah, let's drop the single file thing. That's obviously a constraint that's working against us.

01:39:15.000 --> 01:39:18.000
Let's make this multi-file.

01:39:18.000 --> 01:39:22.000
You know, or even say, you know, if we didn't have that constraint,

01:39:22.000 --> 01:39:25.000
What would the recommendations be and see what it says.

01:39:25.000 --> 01:39:27.000
I think I know what I would do.

01:39:27.000 --> 01:39:29.000
But…

01:39:29.000 --> 01:39:33.000
If you give it enough information, its recommendations are really…

01:39:33.000 --> 01:39:34.000
usually pretty good.

01:39:34.000 --> 01:39:35.000
Oh yeah.

01:39:35.000 --> 01:39:39.000
But if it doesn't have any information to work with, then…

01:39:39.000 --> 01:39:42.000
You know, it's just gonna kind of pick middle-of-the-road answers, really.

01:39:42.000 --> 01:39:49.000
Yep, the more…

01:39:49.000 --> 01:39:59.000
Go.

01:39:59.000 --> 01:40:10.000
Anybody else got questions?

01:40:10.000 --> 01:40:17.000
Alright.

01:40:17.000 --> 01:40:22.000
For a project of this size, the single-file constraint isn't technically painful, but it does conflict with the CSS conventions.

01:40:22.000 --> 01:40:28.000
Separate files is better practice and costs nothing. I'd recommend dropping it.

01:40:28.000 --> 01:40:31.000
So…

01:40:31.000 --> 01:40:33.000
Uh…

01:40:33.000 --> 01:40:38.000
No server required, CSS via link, and JS with script src both work fine.

01:40:38.000 --> 01:40:42.000
The only thing that doesn't work is ES modules, due to browser CORS restrictions.

01:40:42.000 --> 01:40:45.000
So we still write plain non-module JS.
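
To make that "plain non-module JS" point concrete: a rough sketch of how code gets shared across ordinary script tags when import/export is off the table (browsers block module fetches from file:// origins). The namespace name and helper here are hypothetical, not from the actual project.

```javascript
// Without ES modules, attach one namespace object to the global scope
// from an IIFE; later <script> tags read it instead of importing.
(function (global) {
  "use strict";
  const WorldClock = {
    // hypothetical helper: zero-pad a number for a digital readout
    pad(n) { return String(n).padStart(2, "0"); },
  };
  global.WorldClock = WorldClock; // stands in for `export { ... }`
})(globalThis);
```

A second script loaded after this one would then call `globalThis.WorldClock.pad(...)` rather than `import`-ing it.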

01:40:45.000 --> 01:40:49.000
Uh, what stays the same? Okay!

01:40:49.000 --> 01:40:53.000
Looks fine to me, what do you guys think? Jans, you wanna do it?

01:40:53.000 --> 01:40:54.000
Alright, let's do it.

01:40:54.000 --> 01:41:14.000
Sure.

01:41:14.000 --> 01:41:16.000
Alright, let's take a look at this.

01:41:16.000 --> 01:41:21.000
Okay, structure's now index.html, style.css, script.js, no build step, no server.

01:41:21.000 --> 01:41:28.000
Works as a local file. Uh… option A, now with proper separation of concerns,

01:41:28.000 --> 01:41:32.000
Uh, design, Section 1, File Structure.

01:41:32.000 --> 01:41:35.000
index.html, no inline styles, no script logic.

01:41:35.000 --> 01:41:40.000
contains a header, main wrapping the carousel. The main is a div,

01:41:40.000 --> 01:41:45.000
with a class of carousel viewport containing a div class carousel track with 24 article…

01:41:45.000 --> 01:41:53.000
clock cards, generated by JS on load, 2 button arrow elements flanking the viewport for previous next.

01:41:53.000 --> 01:41:55.000
Link to style.css…

01:41:55.000 --> 01:41:58.000
Uh, using inline styles deferred.

01:41:58.000 --> 01:42:04.000
Each clock card contains an SVG analog face with hand elements.

01:42:04.000 --> 01:42:08.000
A footer with flag and city name, UTC offset and digital time readout.

01:42:08.000 --> 01:42:13.000
Home card gets a class "clock-card home".
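
Since the cards are generated by JS on load, the structure just described could be sketched roughly like this. The exact class names, SVG geometry, and attributes are assumptions for illustration, not the actual generated markup.

```javascript
// Hedged sketch of one generated clock card: an <article> holding an
// SVG analog face with hand elements, plus a footer with flag, city,
// UTC offset, and a digital readout.
function clockCardHTML(city, flag, offsetLabel) {
  return [
    '<article class="clock-card">',
    '  <svg class="clock-face" viewBox="0 0 100 100" aria-hidden="true">',
    '    <circle cx="50" cy="50" r="48"></circle>',
    '    <line class="hand hour" x1="50" y1="50" x2="50" y2="30"></line>',
    '    <line class="hand minute" x1="50" y1="50" x2="50" y2="20"></line>',
    '  </svg>',
    `  <footer>${flag} ${city} <span class="offset">${offsetLabel}</span>`,
    '    <time class="digital"></time>',
    '  </footer>',
    '</article>',
  ].join("\n");
}
```

On load, a loop over the 24-city data table would build these strings (or DOM nodes) and append them to the carousel track.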

01:42:13.000 --> 01:42:19.000
Does this structure look right? Looks fine to me.

01:42:19.000 --> 01:42:20.000
Yes.

01:42:20.000 --> 01:42:31.000
Go for it.

01:42:31.000 --> 01:42:36.000
Alright, let's look at the CSS.

01:42:36.000 --> 01:42:38.000
Design tokens?

01:42:38.000 --> 01:42:45.000
Looks fine. Carousel viewports with 100% overflow. See, a lot of this, you'd have to know CSS.

01:42:45.000 --> 01:42:50.000
Otherwise, again, you're gonna be like, okay!

01:42:50.000 --> 01:42:51.000
What I see looks fine.

01:42:51.000 --> 01:42:54.000
Mm-hmm. Okay.

01:42:54.000 --> 01:42:56.000
Jans, you got a problem with anything?

01:42:56.000 --> 01:43:00.000
Okay.

01:43:00.000 --> 01:43:01.000
Yes.

01:43:01.000 --> 01:43:08.000
That's fine. That's fine. I would probably fiddle with it a lot if I was really doing this for real, you know. It would take me the evening, and I would…

01:43:08.000 --> 01:43:10.000
take the care to…

01:43:10.000 --> 01:43:14.000
Do a bunch of little things.

01:43:14.000 --> 01:43:16.000
Okay.

01:43:16.000 --> 01:43:25.000
Here's our JS.

01:43:25.000 --> 01:43:29.000
Looks fine to me. Does anybody have a problem with it?

01:43:29.000 --> 01:43:32.000
Anybody? Okay.

01:43:32.000 --> 01:43:39.000
We're done with the HTML, we're done with CSS, we're done with JS.

01:43:39.000 --> 01:43:40.000
What do you mean by venues?

01:43:40.000 --> 01:43:41.000
Okay, we're now right to the…

01:43:41.000 --> 01:43:42.000
Go ahead.

01:43:42.000 --> 01:43:48.000
Yeah. We're done discussing that part as part of this brainstorming step.

01:43:48.000 --> 01:43:49.000
It hasn't written any of those things.

01:43:49.000 --> 01:43:54.000
Now it's writing the design doc, not the HTML, CSS, and JavaScript.

01:43:54.000 --> 01:43:56.000
Alright, so right now it's just writing specs.

01:43:56.000 --> 01:44:01.000
So they took our requirements overview, and it's writing up much more detailed specs.

01:44:01.000 --> 01:44:05.000
Based on our… our discussion.

01:44:05.000 --> 01:44:08.000
And as you can see here, there's now a docs directory.

01:44:08.000 --> 01:44:11.000
in the folder, and inside Docs,

01:44:11.000 --> 01:44:13.000
We've got superpowers,

01:44:13.000 --> 01:44:18.000
And… let's see what's in that… superpower…

01:44:18.000 --> 01:44:19.000
specs…

01:44:19.000 --> 01:44:20.000
Yeah.

01:44:20.000 --> 01:44:22.000
There's nothing in there right now, because it's still writing it.

01:44:22.000 --> 01:44:26.000
Yeah.

01:44:26.000 --> 01:44:35.000
So this is all part of that superpowers methodology stuff that I was talking about earlier.

01:44:35.000 --> 01:44:41.000
Okay.

01:44:41.000 --> 01:44:44.000
So again, I'll go over here,

01:44:44.000 --> 01:44:47.000
And now… there's the… there's the, uh…

01:44:47.000 --> 01:44:54.000
design document.

01:44:54.000 --> 01:44:58.000
Now it's right… now it's reading the file it just wrote.

01:44:58.000 --> 01:45:02.000
And said, oh, whoa, whoa, we got two ambiguities to fix.

01:45:02.000 --> 01:45:09.000
The init step says overwrites… override city with… whoa.

01:45:09.000 --> 01:45:11.000
Back here…

01:45:11.000 --> 01:45:12.000
Yep.

01:45:12.000 --> 01:45:14.000
It's still thinking. You can watch it as it goes by if you want.

01:45:14.000 --> 01:45:15.000
Okay?

01:45:15.000 --> 01:45:17.000
No, that's fine.

01:45:17.000 --> 01:45:20.000
Yeah, it's done. Okay, let's go back.

01:45:20.000 --> 01:45:26.000
So, there's two ambiguities. The init step says override city with the user's resolved city name, but doesn't say how.

01:45:26.000 --> 01:45:30.000
The city comes from parsing the IANA string, okay?
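
That clarification — deriving the city from the IANA zone string — could look something like this minimal sketch; the function name is hypothetical.

```javascript
// Derive a display city from an IANA time zone ID,
// e.g. "America/New_York" -> "New York".
function cityFromZone(zoneId) {
  const last = zoneId.split("/").pop(); // drop "America/", "Asia/", etc.
  return last.replace(/_/g, " ");       // underscores become spaces
}
```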

01:45:30.000 --> 01:45:34.000
The carousel section references card width without specifying how it's measured.

01:45:34.000 --> 01:45:36.000
The offset parsing step,

01:45:36.000 --> 01:45:42.000
Doesn't say how to convert the short offset string to an integer. So…
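
For the offset ambiguity, a minimal sketch of one way to turn a short offset label into a number. The "UTC+5" / "UTC-3:30" label format here is an assumption about the data table, not the project's actual spec.

```javascript
// Convert a short offset label like "UTC+5" or "UTC-3:30"
// into a signed integer number of minutes.
function offsetToMinutes(label) {
  const m = /^UTC([+-])(\d{1,2})(?::(\d{2}))?$/.exec(label.trim());
  if (!m) throw new Error(`unrecognized offset: ${label}`);
  const sign = m[1] === "-" ? -1 : 1;
  return sign * (Number(m[2]) * 60 + Number(m[3] || 0));
}
```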

01:45:42.000 --> 01:45:47.000
Right here, you see it's updating that design.md file.

01:45:47.000 --> 01:45:49.000
right here…

01:45:49.000 --> 01:45:52.000
And then it tells you how it's fixing it.

01:45:52.000 --> 01:45:57.000
So it said, you know, here are the problems, ah, here are the solutions.

01:45:57.000 --> 01:45:59.000
I'm coming up with…

01:45:59.000 --> 01:46:05.000
And then it… it changed this line to this.

01:46:05.000 --> 01:46:07.000
Okay?

01:46:07.000 --> 01:46:13.000
Specs written and committed, please review it, let me know if you want any changes before I write the implementation plan.

01:46:13.000 --> 01:46:17.000
My bet is there's gonna still be some holes, but that's fine.

01:46:17.000 --> 01:46:18.000
Yeah.

01:46:18.000 --> 01:46:19.000
You can work this stuff out.

01:46:19.000 --> 01:46:24.000
Right. I'll cap the file again…

01:46:24.000 --> 01:46:31.000
Okay, so let's look at this puppy.

01:46:31.000 --> 01:46:36.000
Okay, so this is everything it just wrote.

01:46:36.000 --> 01:46:40.000
Overview, file structure…

01:46:40.000 --> 01:46:43.000
Why it excluded ES modules?

01:46:43.000 --> 01:46:46.000
The color palette?

01:46:46.000 --> 01:46:51.000
Home Clock Highlight, and how it's gonna do that, the signals it's using.

01:46:51.000 --> 01:46:53.000
The clock faces?

01:46:53.000 --> 01:46:55.000
Day and night?

01:46:55.000 --> 01:47:02.000
Nighttime is defined as local hour less than 6, or local hour greater than 18. Jans made it real simple in the document.

01:47:02.000 --> 01:47:04.000
He said it's 6 to 6.

01:47:04.000 --> 01:47:07.000
We're not gonna do anything super fancy.
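
The 6-to-6 rule from the design doc is a one-liner:

```javascript
// Night is any local hour before 6 or after 18 (hour is 0-23),
// exactly as the design doc states; no sunrise/sunset math.
function isNight(hour) {
  return hour < 6 || hour > 18;
}
```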

01:47:07.000 --> 01:47:09.000
Clock face anatomy?

01:47:09.000 --> 01:47:12.000
Then, the HTML structure…

01:47:12.000 --> 01:47:15.000
Each generated clock card.

01:47:15.000 --> 01:47:21.000
Notice it's using semantic elements, that's part of the HTML CSS skill.

01:47:21.000 --> 01:47:24.000
Style sheet structure?

01:47:24.000 --> 01:47:26.000
Oh, it's gonna structure that…

01:47:26.000 --> 01:47:29.000
JavaScript…

01:47:29.000 --> 01:47:31.000
And then, this is what's gonna happen.

01:47:31.000 --> 01:47:35.000
When it runs…

01:47:35.000 --> 01:47:38.000
Carousel controls…

01:47:38.000 --> 01:47:42.000
And finally, there's the city… there's the data table.

01:47:42.000 --> 01:47:45.000
And these things are out of scope.

01:47:45.000 --> 01:47:46.000
Looks good to me!

01:47:46.000 --> 01:47:52.000
Fantastic. So it essentially just expanded on those requirements, but that's fine. That's exactly what you'd expect.

01:47:52.000 --> 01:47:57.000
Looks good.

01:47:57.000 --> 01:48:02.000
Alright, now it's gonna transition, now it's gonna start actually doing the work.

01:48:02.000 --> 01:48:13.000
It's not. It's going to now write an implementation plan based on those specs.

01:48:13.000 --> 01:48:19.000
What would happen if you just asked Claude, not Claude Code, to do that?

01:48:19.000 --> 01:48:22.000
Well, Claude, yeah, so…

01:48:22.000 --> 01:48:25.000
Are you talking about Claude Chat?

01:48:25.000 --> 01:48:26.000
Because there's lots of different…

01:48:26.000 --> 01:48:27.000
I believe that's what he means.

01:48:27.000 --> 01:48:33.000
There's lots of Claude services. You have the Claude chat, you have Claude Cowork, Claude Code.

01:48:33.000 --> 01:48:35.000
Claude API.

01:48:35.000 --> 01:48:41.000
All of those things that do different things. So just Claude Chat is all it does is it generates a text response.

01:48:41.000 --> 01:48:44.000
That's what chat is, it just generates a text response.

01:48:44.000 --> 01:48:45.000
like this.

01:48:45.000 --> 01:48:49.000
Um, actually, Claude Chat has tools that it can do. It can create documents and do some other things.

01:48:49.000 --> 01:48:55.000
some artifacts and whatnot, but it's a completely different harness, so that… I mentioned that harness.

01:48:55.000 --> 01:48:58.000
The Claude code has a whole bunch of methodology,

01:48:58.000 --> 01:49:03.000
And, uh, built into it. The superpowers is stuff that's on top of it.

01:49:03.000 --> 01:49:07.000
But if you would like to check that out, I would totally recommend.

01:49:07.000 --> 01:49:13.000
going to Claude Code site, you want to pull that up, Scott? Or just, you can do just, you know,

01:49:13.000 --> 01:49:18.000
It tells you how Claude code works, and it kind of covers that whole harness.

01:49:18.000 --> 01:49:19.000
You're talking about here?

01:49:19.000 --> 01:49:22.000
Yeah, it's gonna be in Claude Code Docs somewhere.

01:49:22.000 --> 01:49:27.000
Okay?

01:49:27.000 --> 01:49:31.000
Uh… where is it? Resources?

01:49:31.000 --> 01:49:36.000
Isn't that it?

01:49:36.000 --> 01:49:42.000
Where's the docs? What am I missing?

01:49:42.000 --> 01:49:43.000
Where am I supposed to go?

01:49:43.000 --> 01:49:48.000
Here, it was the second result in my Kagi search, Scott. Would you want me to paste it in here?

01:49:48.000 --> 01:49:51.000
And what is it? What do I search for?

01:49:51.000 --> 01:49:58.000
How Claude Code works.

01:49:58.000 --> 01:49:59.000
That one?

01:49:59.000 --> 01:50:02.000
That's it.

01:50:02.000 --> 01:50:03.000
So, this walks you through the whole thing.

01:50:03.000 --> 01:50:09.000
And it goes through in detail how Claude Code is different from Claude, and what it does.

01:50:09.000 --> 01:50:10.000
Right.

01:50:10.000 --> 01:50:13.000
And you can see it's quite a bit different from just the Claude chat.

01:50:13.000 --> 01:50:23.000
Yeah. So, this is the Claude chat. If you ask it about that, you're gonna end up with a bunch of individual files, maybe?

01:50:23.000 --> 01:50:24.000
No.

01:50:24.000 --> 01:50:26.000
It's certainly not going to be doing what we're doing. Now, if you download the Claude app…

01:50:26.000 --> 01:50:31.000
Right? It's got tabs up here, so this is the classic chat, like ChatGPT.

01:50:31.000 --> 01:50:36.000
Co-work is where you're going to ask it to work on an Excel file, or a PowerPoint.

01:50:36.000 --> 01:50:38.000
Or something like that.

01:50:38.000 --> 01:50:39.000
And then there's Code. You can use Code here, but really, it's far, far better to do it in the terminal like we're doing.

01:50:39.000 --> 01:50:54.000
Right. Yeah, it adds a GUI on Claude code, but there's some things that are not available in this GUI that are available in the CLI.

01:50:54.000 --> 01:50:55.000
Yeah.

01:50:55.000 --> 01:51:01.000
that this is for people that are not comfortable with CLIs. This is for those vibe coders out there that are just like, whoa.

01:51:01.000 --> 01:51:05.000
Right.

01:51:05.000 --> 01:51:10.000
Alright. Uh, spec written in… well, where are we?

01:51:10.000 --> 01:51:14.000
We're still waiting. It's still thinking, isn't it?

01:51:14.000 --> 01:51:15.000
Still transitioning.

01:51:15.000 --> 01:51:22.000
Yeah. Three minutes and 25 seconds and counting.

01:51:22.000 --> 01:51:26.000
You just tried it,

01:51:26.000 --> 01:51:33.000
Well, currently 29, but it's been using a lot more. Now, can I type usage at any time, Jans?

01:51:33.000 --> 01:51:41.000
Um, no… you could fire up a tab, enter Claude, and do it. You could fire a new session up right here if you wanted.

01:51:41.000 --> 01:51:46.000
But I wouldn't interrupt its current…

01:51:46.000 --> 01:51:50.000
You're not already in, that's wild, I don't know why.

01:51:50.000 --> 01:51:53.000
I don't know. So, I could go into Claude right now and type usage.

01:51:53.000 --> 01:51:58.000
Sure, sure.

01:51:58.000 --> 01:52:03.000
There you go.

01:52:03.000 --> 01:52:09.000
So, at 11 PM, my usage will reset. I've used 68% so far.

01:52:09.000 --> 01:52:13.000
And 15% for my week.

01:52:13.000 --> 01:52:16.000
There's my stats, my cool little…

01:52:16.000 --> 01:52:19.000
Terminal graphics.

01:52:19.000 --> 01:52:23.000
There you go. Okay.

01:52:23.000 --> 01:52:26.000
Okay, still transitioning…

01:52:26.000 --> 01:52:33.000
So,

01:52:33.000 --> 01:52:35.000
Yeah, Craig, Vernon said,

01:52:35.000 --> 01:52:41.000
He used Claude AI and said, build a world clock with round face in Python,

01:52:41.000 --> 01:52:43.000
that has a dashboard.

01:52:43.000 --> 01:52:47.000
Right, but it's not designed for it, so it's not going to be as easy to work with.

01:52:47.000 --> 01:52:52.000
As we are here, because here, we're actually where we need to be doing this.

01:52:52.000 --> 01:52:59.000
And then once it finally spits out whatever it's doing here, you know, we're going to be able to take control and

01:52:59.000 --> 01:53:01.000
revise it, etc., etc, etc.

01:53:01.000 --> 01:53:07.000
No, no problem, Vernon. You can ask all you want, that's how you learn.

01:53:07.000 --> 01:53:12.000
Okay. Jans, do we have any idea why it's taking so long to transition to the implementation plan?

01:53:12.000 --> 01:53:13.000
What's it doing right now?

01:53:13.000 --> 01:53:17.000
It's taking five minutes, man.

01:53:17.000 --> 01:53:19.000
I know that's 5 minutes!

01:53:19.000 --> 01:53:21.000
It's too long!

01:53:21.000 --> 01:53:29.000
If you were doing this, you'd be done by now with the whole thing, right?

01:53:29.000 --> 01:53:31.000
I mean by hand. We were doing it by hand.

01:53:31.000 --> 01:53:34.000
That's right.

01:53:34.000 --> 01:53:37.000
That is actually a good point. Yeah, we're sitting here waiting.

01:53:37.000 --> 01:53:38.000
But… this is saving massive amounts of time nonetheless.

01:53:38.000 --> 01:53:42.000
Thinking!

01:53:42.000 --> 01:53:45.000
That's when you flip up to one of the other projects.

01:53:45.000 --> 01:53:46.000
That's exactly correct.

01:53:46.000 --> 01:53:47.000
Right, exactly right.

01:53:47.000 --> 01:53:48.000
tabs. That's what CMUX is for, right?

01:53:48.000 --> 01:53:49.000
Exactly. Yeah.

01:53:49.000 --> 01:53:51.000
That's what CMUX is for. I'm like, oh, it's doing okay. I'm going to go to this, do this.

01:53:51.000 --> 01:53:55.000
All right, what's… what's the other AI asking for now?

01:53:55.000 --> 01:53:57.000
Actually…

01:53:57.000 --> 01:53:58.000
Yep.

01:53:58.000 --> 01:54:01.000
Like, at this point, it's like, you know, the slowdown is it asking me questions.

01:54:01.000 --> 01:54:03.000
Right, and so…

01:54:03.000 --> 01:54:06.000
This is actually another one I'm running.

01:54:06.000 --> 01:54:09.000
Right now, for this Lovecraft… Lovecraft.

01:54:09.000 --> 01:54:13.000
clock, and so… meanwhile…

01:54:13.000 --> 01:54:17.000
Uh, why can't it be reached?

01:54:17.000 --> 01:54:18.000
Yeah, it probably died.

01:54:18.000 --> 01:54:20.000
I think it died.

01:54:20.000 --> 01:54:23.000
But…

01:54:23.000 --> 01:54:24.000
Okay, grab me something. Oh, don't grab me something.

01:54:24.000 --> 01:54:27.000
You can tell it.

01:54:27.000 --> 01:54:30.000
Tell him…

01:54:30.000 --> 01:54:33.000
to look up Dewey's Pizza.

01:54:33.000 --> 01:54:38.000
D-E-W-E-Y. Tell him to get a Dewey's Pizza, that's gonna be a lot better.

01:54:38.000 --> 01:54:41.000
My brother's visiting, folks.

01:54:41.000 --> 01:54:47.000
Yeah, it'd be better. Okay, so this is one where I'm having it take Lovecraft,

01:54:47.000 --> 01:54:53.000
creations, and make a… I have images for all of them, and I wanted to make a…

01:54:53.000 --> 01:54:56.000
a filterable, sortable interface for those.

01:54:56.000 --> 01:55:02.000
And so, it said, how do you want to organize these? And I said, well, you know, actually, why don't we make it available to

01:55:02.000 --> 01:55:05.000
to a filter on the alignment.

01:55:05.000 --> 01:55:09.000
Um, uh, type and environment.

01:55:09.000 --> 01:55:11.000
And so it came back, and said, okay.

01:55:11.000 --> 01:55:13.000
Um…

01:55:13.000 --> 01:55:15.000
You know, it's giving me this stuff here, and I can respond.

01:55:15.000 --> 01:55:17.000
The world clock's waiting for you, Scott.

01:55:17.000 --> 01:55:19.000
The world clock's done.

01:55:19.000 --> 01:55:23.000
Oh, there you go. Uh…

01:55:23.000 --> 01:55:24.000
Where?

01:55:24.000 --> 01:55:27.000
It's waiting for your input. I don't know.

01:55:27.000 --> 01:55:29.000
I don't see anything.

01:55:29.000 --> 01:55:31.000
I see.

01:55:31.000 --> 01:55:32.000
Just ask it what's up?

01:55:32.000 --> 01:55:33.000
Said close. Oop!

01:55:33.000 --> 01:55:46.000
Whoop! Here we go. Wrote 770 lines.

01:55:46.000 --> 01:55:48.000
Still thinking.

01:55:48.000 --> 01:55:55.000
Hmm. I don't know why your CMUX is saying Claude is waiting for your input. Mine does not do that.

01:55:55.000 --> 01:55:57.000
Well, I need to look at your settings.

01:55:57.000 --> 01:55:59.000
Clearly.

01:55:59.000 --> 01:56:00.000
I think, I think you need to send your laptop to the grinder and…

01:56:00.000 --> 01:56:03.000
Okay.

01:56:03.000 --> 01:56:07.000
Stop installing so many doohickeys on it.

01:56:07.000 --> 01:56:08.000
I love my doohickeys.

01:56:08.000 --> 01:56:10.000
I know.

01:56:10.000 --> 01:56:16.000
They don't love you.

01:56:16.000 --> 01:56:18.000
So you can tell it to restart the server.

01:56:18.000 --> 01:56:19.000
Right here.

01:56:19.000 --> 01:56:40.000
Yeah, you can say the URL's not working.

01:56:40.000 --> 01:56:43.000
Okay, let's go back… Oh! Here we go!

01:56:43.000 --> 01:56:46.000
Plan complete and save to…

01:56:46.000 --> 01:56:49.000
That document, 2 execution Options,

01:56:49.000 --> 01:56:53.000
Well, that's from the other one. So it's showing me, oh, how about these filters?

01:56:53.000 --> 01:56:56.000
So I could look at it, but let's go back to where we were.

01:56:56.000 --> 01:57:04.000
So, right here. Um, do you want it to be sub-agent driven? That's recommended. I dispatch a fresh sub-agent per task.

01:57:04.000 --> 01:57:07.000
review between tasks, fast iteration, or…

01:57:07.000 --> 01:57:14.000
Execute tasks in this session, using executing plans with checkpoints. So let's go with sub-agents, I think that makes sense, and it should get…

01:57:14.000 --> 01:57:18.000
Are you going to review the… review that document first, or…?

01:57:18.000 --> 01:57:19.000
You don't have.

01:57:19.000 --> 01:57:20.000
I really don't want to review 700 files. 700 lines.

01:57:20.000 --> 01:57:22.000
Okay. You don't have to.

01:57:22.000 --> 01:57:25.000
I know. I'm not going to.

01:57:25.000 --> 01:57:26.000
Okay?

01:57:26.000 --> 01:57:27.000
If you're, if you're doing this for real, I would recommend it.

01:57:27.000 --> 01:57:33.000
Oh, of course, of course, but I don't want us to be here till 10.

01:57:33.000 --> 01:57:39.000
Okay…

01:57:39.000 --> 01:57:43.000
So now it's firing off sub-agents.

01:57:43.000 --> 01:57:47.000
So it's, it's now acting as that orchestration agent.

01:57:47.000 --> 01:57:48.000
Yup.

01:57:48.000 --> 01:57:55.000
Those sub-agents will do different things. They'll have a coding agent, you'll have one that's checking… that's code review agent.

01:57:55.000 --> 01:57:57.000
etc.

01:57:57.000 --> 01:58:03.000
Now, it said I could type slash agents. Again, is that something I should not do now?

01:58:03.000 --> 01:58:06.000
Um, probably not right in the middle, not right here.

01:58:06.000 --> 01:58:09.000
Okay, so here are the 10 tasks.

01:58:09.000 --> 01:58:10.000
Okay? But it broke in.

01:58:10.000 --> 01:58:17.000
And it's gonna, it's gonna work through those and it might, it might work through, it might take 20 minutes to work through those, Scott. I doubt it for this size of the thing, but I've, I've had it.

01:58:17.000 --> 01:58:20.000
sit there and spin on big things that I'm doing.

01:58:20.000 --> 01:58:24.000
For long periods of time, and that's where you go to pop off to another tab, and you do your stuff there.

01:58:24.000 --> 01:58:25.000
Go for lunch.

01:58:25.000 --> 01:58:26.000
Or go eat.

01:58:26.000 --> 01:58:31.000
So there's my world clock so far.

01:58:31.000 --> 01:58:33.000
Looks great.

01:58:33.000 --> 01:58:40.000
I'm not worried.

01:58:40.000 --> 01:58:51.000
Well, I'm not doing much at all, it's doing it.

01:58:51.000 --> 01:58:55.000
Okay.

01:58:55.000 --> 01:58:58.000
Uh-oh, I've used 91% of my session limit, Jans. I told you I need to upgrade.

01:58:58.000 --> 01:59:03.000
Maybe there's the problem.

01:59:03.000 --> 01:59:04.000
I've now used 93%.

01:59:04.000 --> 01:59:06.000
You'd probably want to go update it right now and catch it before you get to a hundred. Go!

01:59:06.000 --> 01:59:22.000
Yup.

01:59:22.000 --> 01:59:23.000
It's a race for time!

01:59:23.000 --> 01:59:31.000
Are you really going to upgrade to the $100 a month one for this?

01:59:31.000 --> 01:59:33.000
Well, I was going to upgrade anyway.

01:59:33.000 --> 01:59:35.000
For what?

01:59:35.000 --> 01:59:37.000
For all the other stuff.

01:59:37.000 --> 01:59:38.000
Uh-huh. What other stuff?

01:59:38.000 --> 01:59:42.000
Okay, well, we'll just do it this month.

01:59:42.000 --> 01:59:50.000
Uh… link sent to Scott at Web Sanity, let me go…

01:59:50.000 --> 01:59:58.000
That should work.

01:59:58.000 --> 02:00:04.000
There we go.

02:00:04.000 --> 02:00:06.000
Sign in…

02:00:06.000 --> 02:00:10.000
Here's the code…

02:00:10.000 --> 02:00:29.000
Code in…

02:00:29.000 --> 02:00:31.000
So, I think it just opened up…

02:00:31.000 --> 02:00:34.000
Playwright.

02:00:34.000 --> 02:00:57.000
I'll get this off your guys' screen, you don't need to see this.

02:00:57.000 --> 02:01:07.000
Waiting. Let's see where we are.

02:01:07.000 --> 02:01:09.000
Okay, did the HTML…

02:01:09.000 --> 02:01:10.000
Now it's doing the CSS.

02:01:10.000 --> 02:01:18.000
Upgrade, upgrade!

02:01:18.000 --> 02:01:26.000
didn't it kind of already do that when it showed us the examples?

02:01:26.000 --> 02:01:27.000
Well, it didn't actually generate the files and do the work.

02:01:27.000 --> 02:01:31.000
I think… But it did kind of do it.

02:01:31.000 --> 02:01:37.000
Well, it… yeah, it had… it had pretty much all the HTML and CSS you needed to…

02:01:37.000 --> 02:01:45.000
Right.

02:01:45.000 --> 02:01:46.000
It's a rip-off, man.

02:01:46.000 --> 02:01:47.000
It's a display.

02:01:47.000 --> 02:01:49.000
Yes.

02:01:49.000 --> 02:01:56.000
It's… it's still… it's still kind of a dumb computer. It doesn't know, you know, that it's… that it's, uh, duplicating work sometimes.

02:01:56.000 --> 02:02:04.000
So, I actually… when I, uh, complete something, I, uh, have a command called retro that it…

02:02:04.000 --> 02:02:08.000
does a self-assessment and can learn things, and…

02:02:08.000 --> 02:02:09.000
Oh, really? What do you… is that built in?

02:02:09.000 --> 02:02:11.000
adjust for next time.

02:02:11.000 --> 02:02:14.000
No, it's, uh…

02:02:14.000 --> 02:02:20.000
There we go.

02:02:20.000 --> 02:02:23.000
Share it!

02:02:23.000 --> 02:02:29.000
Don't be selfish, Craig.

02:02:29.000 --> 02:02:33.000
Alright, back here.

02:02:33.000 --> 02:02:35.000
Okay, step 3…

02:02:35.000 --> 02:02:41.000
Now, that's interesting, it says Haiku 4.5. What's going on with that, Jans?

02:02:41.000 --> 02:02:46.000
Um, it has dropped you down to a different agent so that… because you're running out of session.

02:02:46.000 --> 02:02:50.000
Yeah, but when I go here and type usage, I'm fine.

02:02:50.000 --> 02:02:51.000
So now what do I do?

02:02:51.000 --> 02:02:52.000
I think it's custom. I don't think I've published that one yet, though.

02:02:52.000 --> 02:02:56.000
I don't know.

02:02:56.000 --> 02:02:59.000
Uh, try to quit out of the session, and then, um…

02:02:59.000 --> 02:03:00.000
I wonder if I told it.

02:03:00.000 --> 02:03:03.000
go back in with the restore session.

02:03:03.000 --> 02:03:06.000
Yeah, I think you gotta have to quit and then…

02:03:06.000 --> 02:03:07.000
See what it says.

02:03:07.000 --> 02:03:08.000
go back in, but…

02:03:08.000 --> 02:03:10.000
Yeah, hit the escape key.

02:03:10.000 --> 02:03:13.000
Okay.

02:03:13.000 --> 02:03:16.000
Hit the escape key.

02:03:16.000 --> 02:03:20.000
Whoop, up, boop!

02:03:20.000 --> 02:03:23.000
I'll go ahead and plug it back in again.

02:03:23.000 --> 02:03:30.000
Just whack it on the side.

02:03:30.000 --> 02:03:39.000
It'll be… that'll be worth watching in Task 7. 3 is complete, now we're on to 4.

02:03:39.000 --> 02:03:40.000
Let's see…

02:03:40.000 --> 02:03:45.000
Control-Alt-Delete.

02:03:45.000 --> 02:03:54.000
I don't see Haiku here. We'll see what happens.

02:03:54.000 --> 02:03:57.000
Okay, let's go back here and take a look.

02:03:57.000 --> 02:04:04.000
Do we have anything? Nothing.

02:04:04.000 --> 02:04:07.000
Okay? Meanwhile…

02:04:07.000 --> 02:04:11.000
This part, notice it created this nice little web page.

02:04:11.000 --> 02:04:17.000
And said, okay. I told it to create filters for a HP Lovecraft characters on alignment,

02:04:17.000 --> 02:04:22.000
type and environment, and it says, I've got these four. Whoa!

02:04:22.000 --> 02:04:25.000
Okay.

02:04:25.000 --> 02:04:27.000
Our clock is coming together.

02:04:27.000 --> 02:04:31.000
Let's see if these buttons work. Nope, buttons don't work.

02:04:31.000 --> 02:04:33.000
Let's see what it's saying.

02:04:33.000 --> 02:04:40.000
Oh, this is looking good.

02:04:40.000 --> 02:04:44.000
Oh, right there. See the Git?

02:04:44.000 --> 02:04:45.000
It's using Git for us.

02:04:45.000 --> 02:04:47.000
Looking pretty good.

02:04:47.000 --> 02:04:49.000
In fact…

02:04:49.000 --> 02:04:55.000
Um…

02:04:55.000 --> 02:04:57.000
Nothing to commit, okay.

02:04:57.000 --> 02:05:04.000
Thought I would have committed by now.

02:05:04.000 --> 02:05:07.000
Okay, so, while we're waiting…

02:05:07.000 --> 02:05:13.000
Uh… does anybody have any questions or thoughts? Now's a perfect time, if anybody has any.

02:05:13.000 --> 02:05:16.000
While we're waiting.

02:05:16.000 --> 02:05:19.000
Anybody?

02:05:19.000 --> 02:05:28.000
This is Stan. Is anyone security auditing the code, the AI code behind the scenes?

02:05:28.000 --> 02:05:30.000
That is an excellent question.

02:05:30.000 --> 02:05:31.000
And I'll let Jans and Craig answer it.

02:05:31.000 --> 02:05:33.000
We just don't know.

02:05:33.000 --> 02:05:34.000
And then I have something to ask. I have something to point out.

02:05:34.000 --> 02:05:38.000
What code are we talking about? What code are we talking about?

02:05:38.000 --> 02:05:39.000
I think Stan's saying, who's auditing this stuff for security?

02:05:39.000 --> 02:05:43.000
Oh.

02:05:43.000 --> 02:05:47.000
What stuff are we talking about?

02:05:47.000 --> 02:05:48.000
Well, if you didn't ask it to audit, it did not get audited, right?

02:05:48.000 --> 02:05:49.000
Right. All of it!

02:05:49.000 --> 02:05:51.000
Well, it depends… it depends on the part that you're…

02:05:51.000 --> 02:05:54.000
Like, if you ask it to,

02:05:54.000 --> 02:06:00.000
I'm sorry. 2 people talking. I couldn't make out.

02:06:00.000 --> 02:06:01.000
If… if you ask it to audit it, it will audit it. If you don't, it probably won't.

02:06:01.000 --> 02:06:05.000
Craig, what'd you say?

02:06:05.000 --> 02:06:07.000
So again, you have to know what you're doing,

02:06:07.000 --> 02:06:13.000
Like, to know it should be audited, right? So, anytime I write software,

02:06:13.000 --> 02:06:14.000
Okay?

02:06:14.000 --> 02:06:16.000
Okay, wait, wait. Wait! Wait, wait. Who's auditing the tools? That's what I should say.

02:06:16.000 --> 02:06:21.000
That's what I was wondering, sir, are you talking about the code it's generating, or the tools?

02:06:21.000 --> 02:06:24.000
The tools. Who's auditing the tools? Great.

02:06:24.000 --> 02:06:29.000
Uh, Claude is auditing these tools, and I mean… I mean by that, Claude Code is auditing

02:06:29.000 --> 02:06:38.000
itself, in fact, what did they say at Anthropic? Virtually all of Claude Code is written by Claude Code at this point.

02:06:38.000 --> 02:06:39.000
So, that's what's doing it.

02:06:39.000 --> 02:06:40.000
Okay. Okay.

02:06:40.000 --> 02:06:46.000
But like Craig is saying, if you want to audit this code that's being generated, you have to tell it to audit it.

02:06:46.000 --> 02:06:49.000
Now, that could be something you include.

02:06:49.000 --> 02:06:50.000
You know?

02:06:50.000 --> 02:06:51.000
Okay.

02:06:51.000 --> 02:06:56.000
Right. So yeah, the people at Anthropic are using Claude to audit…

02:06:56.000 --> 02:07:00.000
Claude's code.

02:07:00.000 --> 02:07:01.000
Right. And this just happened, I believe, yesterday?

02:07:01.000 --> 02:07:03.000
Okay.

02:07:03.000 --> 02:07:09.000
Uh, Claude has a new model set to release, and they are calling it…

02:07:09.000 --> 02:07:12.000
Does anybody remember? Mythos!

02:07:12.000 --> 02:07:13.000
Right, and they are not releasing it to the public.

02:07:13.000 --> 02:07:14.000
Mythos.

02:07:14.000 --> 02:07:15.000
Mythos

02:07:15.000 --> 02:07:26.000
Because, apparently, it is so good at finding security vulnerabilities, they ran it against all the major web browsers, and I believe, correct me if I'm wrong, somebody,

02:07:26.000 --> 02:07:29.000
It found hundreds of security vulnerabilities.

02:07:29.000 --> 02:07:30.000
It found a security vulnerability that had been around for 20 years in OpenBSD.

02:07:30.000 --> 02:07:35.000
Yeah, I've heard.

02:07:35.000 --> 02:07:39.000
Which, if I recall, is famous for not having security vulnerabilities.

02:07:39.000 --> 02:07:44.000
So…

02:07:44.000 --> 02:07:45.000
Yes, and so what they…

02:07:45.000 --> 02:07:47.000
I've heard between 700 and 1,000 different vulnerabilities.

02:07:47.000 --> 02:07:48.000
What's that?

02:07:48.000 --> 02:07:50.000
Which was the best? Which was the best.

02:07:50.000 --> 02:07:56.000
Which was the best. I don't understand what that means.

02:07:56.000 --> 02:07:57.000
That was not released, because…

02:07:57.000 --> 02:07:58.000
It just didn't…

02:07:58.000 --> 02:07:59.000
Well, the browsers and all the security problems. Oh, that's all right.

02:07:59.000 --> 02:08:04.000
They are… hold on, because they have gone to Apple

02:08:04.000 --> 02:08:11.000
Google, uh, IBM, Microsoft, and they're giving them the results and allowing them to use it.

02:08:11.000 --> 02:08:17.000
Because they've said, we are not going to allow the public to have access to this, because this could conceivably be

02:08:17.000 --> 02:08:21.000
Such a powerful tool in the wrong hands.

02:08:21.000 --> 02:08:23.000
We are going to give these companies

02:08:23.000 --> 02:08:27.000
heads-up notice, so they can fix the problems.

02:08:27.000 --> 02:08:28.000
So…

02:08:28.000 --> 02:08:33.000
You know, Scott, something that you and I haven't talked about, and I'd be really interested to hear your.

02:08:33.000 --> 02:08:41.000
Your take on it and your opinion about Anthropic being a CPG.

02:08:41.000 --> 02:08:42.000
You mean… what?

02:08:42.000 --> 02:08:44.000
The Corporation for the Public Good.

02:08:44.000 --> 02:08:46.000
Public good? Okay.

02:08:46.000 --> 02:08:52.000
Um, well, there's a couple of ways to think about that. So, as most of you guys probably know from the news,

02:08:52.000 --> 02:08:55.000
Right? Uh, the, um…

02:08:55.000 --> 02:08:59.000
Department of Defense, I'm not going to call it the Department of War, because the name hasn't changed,

02:08:59.000 --> 02:09:06.000
The Department of Defense said to Anthropic, hey, we basically want to use your tool for anything we want.

02:09:06.000 --> 02:09:13.000
And Anthropic said, no, we don't… we're not going to give you the right to do whatever you want. We don't allow you to use it.

02:09:13.000 --> 02:09:19.000
to… for autonomous robots, right? I think that was one of the things, or…

02:09:19.000 --> 02:09:21.000
Human targeting? Was that it?

02:09:21.000 --> 02:09:28.000
And as a result, the Defense Department and Trump said, oh, we're not going to have you anymore, and we're going to declare you

02:09:28.000 --> 02:09:32.000
a tool that nobody in the Defense Department can use for any reason.

02:09:32.000 --> 02:09:33.000
And about a day later, OpenAI goes, oh, we'll do it!

02:09:33.000 --> 02:09:36.000
Yeah.

02:09:36.000 --> 02:09:42.000
We'll sign up, and then Sam Altman came out and said, they've agreed that they won't use it.

02:09:42.000 --> 02:09:47.000
for these things, but of course, it's not in writing. They just basically said, we won't do it.

02:09:47.000 --> 02:09:54.000
Trust us. And so, that caused a lot of people to switch to Anthropic. I know two people, at least,

02:09:54.000 --> 02:10:00.000
that were using ChatGPT and switched to Claude, because they felt like, oh, these guys are more ethical.

02:10:00.000 --> 02:10:04.000
than ChatGPT. Um, so I think it's helped them.

02:10:04.000 --> 02:10:08.000
Uh… that said, they are a company.

02:10:08.000 --> 02:10:10.000
Um, so, you know…

02:10:10.000 --> 02:10:13.000
companies do all sorts of things all the time that are problematic.

02:10:13.000 --> 02:10:20.000
I don't think they're angels. I do think, overall, they take AI safety a lot more seriously

02:10:20.000 --> 02:10:23.000
than OpenAI, for instance.

02:10:23.000 --> 02:10:25.000
or Google, um…

02:10:25.000 --> 02:10:26.000
So, I think they're…

02:10:26.000 --> 02:10:31.000
What do you think about the concept of a CPG in general?

02:10:31.000 --> 02:10:32.000
I think they're generally okay. I haven't really looked into it a lot.

02:10:32.000 --> 02:10:35.000
What?

02:10:35.000 --> 02:10:38.000
It sounds to me like you're asking me a loaded question. Go ahead.

02:10:38.000 --> 02:10:42.000
No, I'm not. I'm curious, I'm…

02:10:42.000 --> 02:10:46.000
generally curious on what you think. I mean, you've looked at this stuff a lot more than I have.

02:10:46.000 --> 02:10:48.000
Okay.

02:10:48.000 --> 02:10:49.000
Right.

02:10:49.000 --> 02:10:50.000
I think it's okay. I haven't looked at it in huge, huge…

02:10:50.000 --> 02:10:51.000
Detail.

02:10:51.000 --> 02:10:52.000
What? What? What is a corporation for public good?

02:10:52.000 --> 02:10:54.000
So, it means that they've added

02:10:54.000 --> 02:10:59.000
Extra legal handcuffs to themselves willingly.

02:10:59.000 --> 02:11:03.000
saying that… that the public good.

02:11:03.000 --> 02:11:06.000
outranks their shareholders.

02:11:06.000 --> 02:11:13.000
the shareholders' good. And that they can be sued and prosecuted if it's proven that they've made choices

02:11:13.000 --> 02:11:17.000
that prioritize their shareholders and their own company over the public good.

02:11:17.000 --> 02:11:18.000
Yeah.

02:11:18.000 --> 02:11:23.000
Now, there's… it's a weak mechanism to enforce that, but it's still an enforceable…

02:11:23.000 --> 02:11:26.000
Um, legal…

02:11:26.000 --> 02:11:30.000
Structure.

02:11:30.000 --> 02:11:31.000
Yeah, except it's in your charter.

02:11:31.000 --> 02:11:33.000
Oh, you mean "do no evil," right?

02:11:33.000 --> 02:11:38.000
It's in your, you know, right, not just a… not just a, uh…

02:11:38.000 --> 02:11:39.000
Slogan.

02:11:39.000 --> 02:11:43.000
Right. And it has at least some teeth. There are legal mechanisms to enforce it.

02:11:43.000 --> 02:11:49.000
It's not just saying, yes, we are going to do this. It's not an aspiration. There's actual legal mechanisms to enforce

02:11:49.000 --> 02:11:56.000
that they comply. Now, there's not… there's not anybody to… that's watching them, particularly, you know, there's not a CPG police out there.

02:11:56.000 --> 02:11:58.000
Yes. But.

02:11:58.000 --> 02:12:02.000
And usually when CPGs get sued, they get sued by.

02:12:02.000 --> 02:12:05.000
You know, a consumer group or shareholders themselves, or…

02:12:05.000 --> 02:12:06.000
you know, whatever the case may be.

02:12:06.000 --> 02:12:09.000
Yeah. At least it's a pathway that's legal. Right.

02:12:09.000 --> 02:12:17.000
Yeah, yeah, exactly. And, and it does, you know, put your money where your mouth is to some degree when they say, hey, you know, this is our intent.

02:12:17.000 --> 02:12:22.000
with this company, and so far, Anthropic has, you know, walked that walk.

02:12:22.000 --> 02:12:25.000
Which I feel much more comfortable about than many of the other ones.

02:12:25.000 --> 02:12:27.000
Google.

02:12:27.000 --> 02:12:30.000
OpenAI, Meta.

02:12:30.000 --> 02:12:32.000
Certainly meta, certainly Grok, and…

02:12:32.000 --> 02:12:36.000
shitter, you know.

02:12:36.000 --> 02:12:37.000
Yeah.

02:12:37.000 --> 02:12:38.000
Hmm.

02:12:38.000 --> 02:12:42.000
I mean, Twitter, I mean, XAI Grok,

02:12:42.000 --> 02:12:45.000
is explicitly doing…

02:12:45.000 --> 02:12:49.000
Horrible things, and Musk is just laughing hysterically about it.

02:12:49.000 --> 02:12:55.000
Um, you know, when I said child sexual abuse material, I was not exaggerating.

02:12:55.000 --> 02:13:02.000
Uh, it allowed people to upload a picture of somebody, and then nudify the image, and so all these young women…

02:13:02.000 --> 02:13:12.000
Uh, people were uploading pictures of them right and left, and saying, uh, show me what this person would look like nude, and it was doing it, and they were publicly available.

02:13:12.000 --> 02:13:21.000
So, um, horrible. That's absolutely… there's no excuse, it's reprehensible, and Musk was, like, writing, you know, dismissive things on Twitter, and…

02:13:21.000 --> 02:13:25.000
And, you know, laughing about it and so on, and uh…

02:13:25.000 --> 02:13:29.000
It's absolutely horrible. It's gotten banned in several countries as a result of it.

02:13:29.000 --> 02:13:32.000
And of course, Apple and Google did nothing.

02:13:32.000 --> 02:13:40.000
They should have banned the app from the App Store, but of course they did nothing.

02:13:40.000 --> 02:13:42.000
Okay, we are at task 7.

02:13:42.000 --> 02:13:45.000
The home clock?

02:13:45.000 --> 02:13:47.000
That's what we have so far.

02:13:47.000 --> 02:13:51.000
We are not in Honolulu.

02:13:51.000 --> 02:13:53.000
But our clocks are…

02:13:53.000 --> 02:13:55.000
Yeah, it's doing a pretty good job so far.

02:13:55.000 --> 02:13:57.000
Yeah?

02:13:57.000 --> 02:14:02.000
Question. This is Stan. Had you heard, with the Mythos, that they're…

02:14:02.000 --> 02:14:08.000
They're giving these tools, or the output, to the open source community?

02:14:08.000 --> 02:14:10.000
I believe they're giving… oh, no.

02:14:10.000 --> 02:14:12.000
I don't think so, I don't know. Uh, let's look…

02:14:12.000 --> 02:14:17.000
I mean, or at least the results? They found something, and you said, like…

02:14:17.000 --> 02:14:20.000
Well, I know that they said they had found something,

02:14:20.000 --> 02:14:23.000
in OpenBSD.

02:14:23.000 --> 02:14:24.000
Uh, it was just the last couple of days, let me look…

02:14:24.000 --> 02:14:26.000
Yeah.

02:14:26.000 --> 02:14:27.000
Craig bubbled this up really early in the night.

02:14:27.000 --> 02:14:31.000
Glasswing! Look up Glasswing.

02:14:31.000 --> 02:14:32.000
Glass wing, here at ours.

02:14:32.000 --> 02:14:35.000
Yeah. Um…

02:14:35.000 --> 02:14:36.000
I don't know.

02:14:36.000 --> 02:14:37.000
Glass wing?

02:14:37.000 --> 02:14:38.000
Ring.

02:14:38.000 --> 02:14:39.000
Yeah.

02:14:39.000 --> 02:14:40.000
The wind. Wing?

02:14:40.000 --> 02:14:44.000
wing, might be one word.

02:14:44.000 --> 02:14:48.000
That's not it. Um, let me just do a regular search for it.

02:14:48.000 --> 02:14:55.000
Outside of ours.

02:14:55.000 --> 02:15:06.000
Let's sort by… date.

02:15:06.000 --> 02:15:07.000
Should be, like, 2 days ago?

02:15:07.000 --> 02:15:09.000
They probably haven't covered it yet, then.

02:15:09.000 --> 02:15:12.000
No, no, I thought I read the article…

02:15:12.000 --> 02:15:15.000
in…

02:15:15.000 --> 02:15:16.000
Huh.

02:15:16.000 --> 02:15:20.000
He probably read it on Anthropic's site.

02:15:20.000 --> 02:15:27.000
It was in the news.

02:15:27.000 --> 02:15:30.000
There's the article.

02:15:30.000 --> 02:15:43.000
At Anthropic.

02:15:43.000 --> 02:15:44.000
I was looking for a news article about it, like, where I've sort of seen the news article in place, I would have…

02:15:44.000 --> 02:15:49.000
out of the body.

02:15:49.000 --> 02:15:52.000
It was Wired.

02:15:52.000 --> 02:15:55.000
Right.

02:15:55.000 --> 02:15:56.000
The company announced…

02:15:56.000 --> 02:15:58.000
Basically, any… any…

02:15:58.000 --> 02:16:00.000
you know, internet infrastructure,

02:16:00.000 --> 02:16:03.000
If they just released this thing, it would be…

02:16:03.000 --> 02:16:04.000
disasters.

02:16:04.000 --> 02:16:07.000
everyone's… yeah, it would be a disaster security-wise, so…

02:16:07.000 --> 02:16:10.000
They're giving it to the people that can…

02:16:10.000 --> 02:16:14.000
fix that. So, you know, Linux Foundation, Cisco,

02:16:14.000 --> 02:16:15.000
You know, for the routers and stuff.

02:16:15.000 --> 02:16:16.000
There you go.

02:16:16.000 --> 02:16:22.000
The idea, in part, is simply to give developers of the world's foundational tech platforms time

02:16:22.000 --> 02:16:29.000
to turn Mythos Preview on their own systems so they can mitigate vulnerabilities and exploit chains that the model develops

02:16:29.000 --> 02:16:32.000
in simulated attacks.

02:16:32.000 --> 02:16:37.000
Yup. To kickstart urgent exploration of how AI capabilities across the industry…

02:16:37.000 --> 02:16:49.000
are on the precipice of upending current software security and digital defense practice. I believe Bruce Schneier wrote about this, too.

02:16:49.000 --> 02:16:51.000
Let's see… so…

02:16:51.000 --> 02:16:57.000
Microsoft, Apple, Google, Amazon, the Linux Foundation, Cisco, Nvidia,

02:16:57.000 --> 02:17:03.000
Broadcom and more than 40 other tech cybersecurity, critical infrastructure, and financial organizations

02:17:03.000 --> 02:17:07.000
That will have private access to the model.

02:17:07.000 --> 02:17:10.000
So, this is kind of a BFD, you know?

02:17:10.000 --> 02:17:14.000
Wait a minute. What was the, uh…

02:17:14.000 --> 02:17:15.000
Why do finance companies need that?

02:17:15.000 --> 02:17:17.000
Big fucking deal. What's that?

02:17:17.000 --> 02:17:18.000
Why do finance companies need that?

02:17:18.000 --> 02:17:21.000
PST.

02:17:21.000 --> 02:17:26.000
You don't think it'd be good for banks to know in finance companies to have access to this?

02:17:26.000 --> 02:17:31.000
I only saw one mention in the release, and it was, I think, Wells Fargo.

02:17:31.000 --> 02:17:32.000
But…

02:17:32.000 --> 02:17:36.000
Expedia, so they can leak it to the world.

02:17:36.000 --> 02:17:39.000
Yeah. So anyway, this happened…

02:17:39.000 --> 02:17:41.000
This was yesterday.

02:17:41.000 --> 02:17:46.000
So, I would definitely keep up with it. There you go, Project Glasswing.

02:17:46.000 --> 02:17:49.000
I would definitely keep up with that.

02:17:49.000 --> 02:17:52.000
Because it's going to be in the news for a while.

02:17:52.000 --> 02:17:56.000
Alright, we are… we are at task 10.

02:17:56.000 --> 02:18:03.000
It is doing final verification.

02:18:03.000 --> 02:18:08.000
I'm sorry, what? Something weird happened.

02:18:08.000 --> 02:18:11.000
Was that Gary?

02:18:11.000 --> 02:18:15.000
sounded like it.

02:18:15.000 --> 02:18:18.000
Now, that's kind of weird.

02:18:18.000 --> 02:18:23.000
Yeah, go ahead. Yeah, the what was the name of that project you just mentioned, Project Glass Link?

02:18:23.000 --> 02:18:29.000
Glass Wing, all one word, G-L-A-S-S, wing, W-I-N-G.

02:18:29.000 --> 02:18:32.000
Thank you.

02:18:32.000 --> 02:18:38.000
Okay, Craig, you said you published your skill.

02:18:38.000 --> 02:18:39.000
Yep, yep.

02:18:39.000 --> 02:18:42.000
Right? How do we access it? How do I use it?

02:18:42.000 --> 02:18:46.000
It probably made a README for you there?

02:18:46.000 --> 02:18:48.000
If you go to my repo, I think?

02:18:48.000 --> 02:18:50.000
Alright, let's go there.

02:18:50.000 --> 02:18:54.000
The AI skills…

02:18:54.000 --> 02:18:56.000
All right, Booch Tech just published AI Skills.

02:18:56.000 --> 02:18:57.000
Oh, yep. Installation.

02:18:57.000 --> 02:19:00.000
And there's no README, jerk.

02:19:00.000 --> 02:19:05.000
What? Oh, go up to the top level. I put you in a…

02:19:05.000 --> 02:19:07.000
Subfolder.

02:19:07.000 --> 02:19:08.000
There you go, right there.

02:19:08.000 --> 02:19:09.000
All right, so there you go.

02:19:09.000 --> 02:19:11.000
Copy-paste.

02:19:11.000 --> 02:19:13.000
Let's do it!

02:19:13.000 --> 02:19:19.000
Let's add Craig.

02:19:19.000 --> 02:19:21.000
Alright, let's go in…

02:19:21.000 --> 02:19:28.000
Go here. By the way, notice my usage is all of a sudden a lot better.

02:19:28.000 --> 02:19:31.000
Alright, uh…

02:19:31.000 --> 02:19:35.000
plug-ins. Oops, slash.

02:19:35.000 --> 02:19:37.000
Marketplace…

02:19:37.000 --> 02:19:40.000
at Marketplace…

02:19:40.000 --> 02:19:43.000
Oh, I just had to do that. I'm a dumbass, sorry.

02:19:43.000 --> 02:19:46.000
Whoop!

02:19:46.000 --> 02:19:50.000
Let me go back.

02:19:50.000 --> 02:19:56.000
Just do it right here, like that.

02:19:56.000 --> 02:20:01.000
Alright, and then we need to…

02:20:01.000 --> 02:20:10.000
Plug in, install, Booch Tech.

02:20:10.000 --> 02:20:13.000
Okay.

02:20:13.000 --> 02:20:18.000
Can I trust you, Craig?

02:20:18.000 --> 02:20:19.000
Okay.

02:20:19.000 --> 02:20:22.000
I think so.

02:20:22.000 --> 02:20:24.000
Should I go ahead and do project scope?

02:20:24.000 --> 02:20:26.000
No, do user scope.

02:20:26.000 --> 02:20:30.000
Yeah, you probably want that stuff for later.

02:20:30.000 --> 02:20:31.000
There we go.

02:20:31.000 --> 02:20:35.000
Okay, so where are we? The projects are Scott?

02:20:35.000 --> 02:20:39.000
You could… you could run slash retro now.

02:20:39.000 --> 02:20:41.000
Although you might have to.

02:20:41.000 --> 02:20:42.000
I don't know if it'll work in other instances yet.

02:20:42.000 --> 02:20:45.000
No, you'd have to restart the session.

02:20:45.000 --> 02:20:46.000
Yeah.
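The plugin-install steps typed on screen can be summarized like this (a hedged reconstruction: the marketplace path is a placeholder for Craig's actual repo, and "booch-tech" is the plugin name as spoken):

```
# Inside a Claude Code session:
/plugin marketplace add <github-user>/<ai-skills-repo>   # register Craig's marketplace
/plugin install booch-tech                               # choose user scope when prompted
# Then restart the session so new slash commands (e.g. /retro) are picked up.
```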

02:20:46.000 --> 02:20:49.000
But, as you saw, Scott, there's a problem.

02:20:49.000 --> 02:20:52.000
Oh, there's a big problem. Yeah.

02:20:52.000 --> 02:20:54.000
How would you describe the problem to it?

02:20:54.000 --> 02:20:56.000
I would say that…

02:20:56.000 --> 02:21:02.000
uh… Chicago is repeated twice, one day, one night.

02:21:02.000 --> 02:21:06.000
There appears to be an extra one for the day next to the correct one for night.

02:21:06.000 --> 02:21:08.000
Why do you think that that's happening?

02:21:08.000 --> 02:21:11.000
I have no idea.

02:21:11.000 --> 02:21:14.000
So…

02:21:14.000 --> 02:21:15.000
The time is actually completely correct. It's 8:52.

02:21:15.000 --> 02:21:17.000
One is the current time?

02:21:17.000 --> 02:21:25.000
No, no, it's having a collision, Scott. So… remember the style for your time has that orange on it?

02:21:25.000 --> 02:21:27.000
And the house?

02:21:27.000 --> 02:21:28.000
Okay.

02:21:28.000 --> 02:21:33.000
So it's… it's centering the Chicago time, which happens to be your time,

02:21:33.000 --> 02:21:36.000
And so you're saying, so you, you need to let it know what's going on.

02:21:36.000 --> 02:21:43.000
Say, hey, I'm in the Chicago zone, check this out in Playwright and tell me what you think is going wrong.

02:21:43.000 --> 02:21:48.000
And it can probably do a better job of describing what's going on than I'm doing right now.

02:21:48.000 --> 02:21:51.000
But it's doing the…

02:21:51.000 --> 02:21:54.000
your zone, in addition to all…

02:21:54.000 --> 02:22:00.000
All 24 zones, when it should be replacing the one for your zone.

02:22:00.000 --> 02:22:02.000
And the time on your zone is wrong.

02:22:02.000 --> 02:22:03.000
Yep.

02:22:03.000 --> 02:22:07.000
Okay?
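The collision described here (the viewer's zone rendered twice: once in the fixed list of 24 zones and once as the highlighted home card) can be avoided by detecting the viewer's IANA zone and marking the matching card instead of appending a second one. A minimal sketch, where `ZONES` and `renderClocks` are hypothetical names, not taken from the actual project:

```javascript
// Representative IANA zones, one per card (illustrative subset of the 24).
const ZONES = [
  "Pacific/Honolulu",
  "America/Los_Angeles",
  "America/Chicago",
  "America/New_York",
  "Europe/London",
  "Asia/Tokyo",
];

function renderClocks(
  now = new Date(),
  homeZone = Intl.DateTimeFormat().resolvedOptions().timeZone
) {
  // Mark the viewer's zone in place rather than adding a duplicate card for it.
  return ZONES.map((zone) => ({
    zone,
    isHome: zone === homeZone, // style this card, don't append another
    time: now.toLocaleTimeString("en-US", {
      timeZone: zone,
      hour: "numeric",
      minute: "2-digit",
    }),
  }));
}
```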

02:22:07.000 --> 02:22:09.000
vibing.

02:22:09.000 --> 02:22:13.000
Again, those change constantly. Jans, what were some of the ones you saw?

02:22:13.000 --> 02:22:19.000
Craig sent the… posted where all of them are.

02:22:19.000 --> 02:22:21.000
The whole list.

02:22:21.000 --> 02:22:25.000
In the chat.

02:22:25.000 --> 02:22:26.000
I do not see where that is.

02:22:26.000 --> 02:22:36.000
7.52 PM Central.

02:22:36.000 --> 02:22:47.000
This is Stan. We're getting close to 9 o'clock. But considering how many people are here and the great interest, we can go for an extended period. So it's not a problem. Okay, Scott and Jans.

02:22:47.000 --> 02:22:51.000
I would like to… I would like to finish it.

02:22:51.000 --> 02:22:57.000
I know, I know, but… Don't worry about the time. It's whenever you guys are ready to stop.

02:22:57.000 --> 02:23:02.000
Well, A, we should worry about the time that it shows Chicago's time wrong.

02:23:02.000 --> 02:23:04.000
So that's when it opened.

02:23:04.000 --> 02:23:06.000
Which obviously is showing…

02:23:06.000 --> 02:23:08.000
I think that's when it opened?

02:23:08.000 --> 02:23:10.000
In a browser, and that doesn't look…

02:23:10.000 --> 02:23:11.000
And softwares…

02:23:11.000 --> 02:23:14.000
That's right. I don't think so, Scott.

02:23:14.000 --> 02:23:15.000
What?

02:23:15.000 --> 02:23:17.000
I don't think so.

02:23:17.000 --> 02:23:19.000
Oh, you don't think so? You think that's old? Okay.

02:23:19.000 --> 02:23:22.000
See what it said?

02:23:22.000 --> 02:23:25.000
What'd it say? What is… what is it?

02:23:25.000 --> 02:23:29.000
Your session. It said, I can't… I can't access the file.

02:23:29.000 --> 02:23:33.000
And that I have to fire up a server.

02:23:33.000 --> 02:23:40.000
File is blocked by Playwright, I gotta spin up the local server.

02:23:40.000 --> 02:23:49.000
And oftentimes, if I want to see if it's going to spot something, I'll ask it a question. I don't think you asked it a question.

02:23:49.000 --> 02:23:58.000
Okay.

02:23:58.000 --> 02:23:59.000
So it found something.

02:23:59.000 --> 02:24:00.000
Can you tell me what, you know, take a look. I see something wrong, can you see it?

02:24:00.000 --> 02:24:06.000
There you go. There's one thing, DST mismatch.

02:24:06.000 --> 02:24:08.000
Duplicate data date?

02:24:08.000 --> 02:24:11.000
Data IANA collision.

02:24:11.000 --> 02:24:16.000
There are now two cards with America/Chicago.

02:24:16.000 --> 02:24:17.000
So it found both problems.

02:24:17.000 --> 02:24:21.000
Right.

02:24:21.000 --> 02:24:24.000
Notice again, I did not say what the problems were.

02:24:24.000 --> 02:24:28.000
I simply said, I'm in the Chicago zone, check this in Playwright,

02:24:28.000 --> 02:24:33.000
to figure out what's wrong.

02:24:33.000 --> 02:24:38.000
It saw what's wrong, and is now fixing it.

02:24:38.000 --> 02:24:41.000
Oh, see the compacting right here?

02:24:41.000 --> 02:24:43.000
That's what we were talking about earlier.

02:24:43.000 --> 02:24:58.000
It said, okay, I've got a bunch of stuff here, I need to compact this.

02:24:58.000 --> 02:25:07.000
Happens here…

02:25:07.000 --> 02:25:09.000
It didn't notice, though, that the…

02:25:09.000 --> 02:25:13.000
5-hour offset clock isn't updating.

02:25:13.000 --> 02:25:15.000
And the others are, right?

02:25:15.000 --> 02:25:17.000
This one? Yeah.

02:25:17.000 --> 02:25:18.000
Yeah.

02:25:18.000 --> 02:25:20.000
But I think that's gonna disappear.

02:25:20.000 --> 02:25:23.000
Oh, yeah, yeah, I know. But, I mean…

02:25:23.000 --> 02:25:24.000
It didn't notice that issue.

02:25:24.000 --> 02:25:25.000
True.

02:25:25.000 --> 02:25:30.000
Now, if I was doing this for real, Scott, I probably would have had it use web components for all those little clock things.

02:25:30.000 --> 02:25:31.000
Right.

02:25:31.000 --> 02:25:34.000
Now, I'd also probably have done it in Vite so that I had a…

02:25:34.000 --> 02:25:38.000
Had a local dev environment with the real server, and…

02:25:38.000 --> 02:25:44.000
You can see, you know,

02:25:44.000 --> 02:25:45.000
Sure.

02:25:45.000 --> 02:25:47.000
HMR, the hot module reload stuff going on.

02:25:47.000 --> 02:25:49.000
Go in and tweak the CSS.
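The web-component approach mentioned here could look something like this minimal sketch (the `zone-clock` element name, its `zone` attribute, and the `zoneTime` helper are illustrative assumptions, not the project's actual code; the element is guarded so the file also loads outside a browser):

```javascript
// Pure formatting helper, testable without a DOM.
function zoneTime(zone, date = new Date()) {
  return date.toLocaleTimeString("en-US", {
    timeZone: zone,
    hour: "numeric",
    minute: "2-digit",
  });
}

// Browser-only custom element wrapping the helper.
if (typeof HTMLElement !== "undefined" && typeof customElements !== "undefined") {
  class ZoneClock extends HTMLElement {
    connectedCallback() {
      this.render();
      this.timer = setInterval(() => this.render(), 1000); // tick once a second
    }
    disconnectedCallback() {
      clearInterval(this.timer); // stop ticking when removed from the DOM
    }
    render() {
      this.textContent = zoneTime(this.getAttribute("zone") || "UTC");
    }
  }
  customElements.define("zone-clock", ZoneClock);
}
```

Each card would then just be markup like `<zone-clock zone="America/Chicago"></zone-clock>`, and a dev server's hot module reload would swap the component in place while you tweak the CSS.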

02:25:49.000 --> 02:25:53.000
And you might want to open this project in the…

02:25:53.000 --> 02:25:54.000
It's a really… let's do it.

02:25:54.000 --> 02:25:55.000
VS Code and take a look at what it did. Check out its CSS, check out its HTML.

02:25:55.000 --> 02:25:56.000
Let's do it.

02:25:56.000 --> 02:25:59.000
Take a look at its… look at its JavaScript.

02:25:59.000 --> 02:26:03.000
Let's open the folder…

02:26:03.000 --> 02:26:07.000
And let's go to Developer…

02:26:07.000 --> 02:26:10.000
World Clock…

02:26:10.000 --> 02:26:15.000
Alright. So, here's what it did.

02:26:15.000 --> 02:26:17.000
Here's the files.

02:26:17.000 --> 02:26:19.000
Okay.

02:26:19.000 --> 02:26:22.000
That's the stuff superpowers was doing.

02:26:22.000 --> 02:26:24.000
Alright, what do you want me to open, Jans?

02:26:24.000 --> 02:26:25.000
index.html?

02:26:25.000 --> 02:26:27.000
What did I say? No, I said the CSS first, but yes, HTML and CSS.

02:26:27.000 --> 02:26:34.000
Oh. There's the CSS.

02:26:34.000 --> 02:26:38.000
I'll close that so we can see it more easily. So, it created…

02:26:38.000 --> 02:26:42.000
246 lines of CSS for us.

02:26:42.000 --> 02:26:45.000
Oh, it did nesting by default, that's nice.

02:26:45.000 --> 02:26:48.000
Why do you, why do you think it did that?

02:26:48.000 --> 02:26:49.000
Because it's more efficient. Oh, look at this!

02:26:49.000 --> 02:26:51.000
No, because like…

02:26:51.000 --> 02:26:55.000
Because you have it in your instructions, that's why.

02:26:55.000 --> 02:26:56.000
Hey, look, guys!

02:26:56.000 --> 02:26:58.000
Because I told it to. Otherwise, it…

02:26:58.000 --> 02:26:59.000
Look at that.

02:26:59.000 --> 02:27:00.000
Software's never done.

02:27:00.000 --> 02:27:05.000
It's fixed.

02:27:05.000 --> 02:27:12.000
Let's see if I can swipe. I sure can.

02:27:12.000 --> 02:27:16.000
I noticed one thing, it doesn't appear to wrap around, does it?

02:27:16.000 --> 02:27:17.000
Yup, it does not wrap around.

02:27:17.000 --> 02:27:21.000
David.

02:27:21.000 --> 02:27:22.000
I know. I'm going to.

02:27:22.000 --> 02:27:24.000
Tell it to.

02:27:24.000 --> 02:27:28.000
Probably gonna cost you about 10 cents, though.

02:27:28.000 --> 02:27:31.000
It's only cost him a hundred bucks.

02:27:31.000 --> 02:27:33.000
It's not gonna cost them any more.

02:27:33.000 --> 02:27:39.000
You can play with this world clock day and night, all week, and you still wouldn't hit the limit now.

02:27:39.000 --> 02:27:43.000
Yeah, we're kind of subsidized, though. Like, if we were paying as you go…

02:27:43.000 --> 02:27:46.000
It could easily be, like, something like 10 cents. Like you said,

02:27:46.000 --> 02:27:47.000
Yeah, it's true.

02:27:47.000 --> 02:27:51.000
Uh, like, your dance thing is, like, about 50 cents a run.

02:27:51.000 --> 02:27:52.000
Yeah. Can we show that?

02:27:52.000 --> 02:27:53.000
Um, and these…

02:27:53.000 --> 02:27:56.000
these packages…

02:27:56.000 --> 02:28:07.000
Um, are only for, like, personal use. So if you've got a website that's using AI, it's gonna always use the API costs.

02:28:07.000 --> 02:28:08.000
Yep.

02:28:08.000 --> 02:28:10.000
Right.

02:28:10.000 --> 02:28:18.000
Okay, so I told it I'd like to have the clocks wrap around when I get to the far right or left.

02:28:18.000 --> 02:28:21.000
Let's see what happens.

02:28:21.000 --> 02:28:22.000
Okay, Jans, is there anything you want to point out about the CSS?

02:28:22.000 --> 02:28:27.000
Yeah, it's not using BEM. It's nice semantic…

02:28:27.000 --> 02:28:31.000
CSS, not using a bunch of utility, um, classes.

02:28:31.000 --> 02:28:33.000
like Tailwind, or that sort of thing. Yup.

02:28:33.000 --> 02:28:38.000
Right. And that's all because of the skill that I, you know, I told it not to do all that stuff.

02:28:38.000 --> 02:28:39.000
So, good.

02:28:39.000 --> 02:28:42.000
Right. Yeah. So that looks good.

02:28:42.000 --> 02:28:46.000
There's the JavaScript for those who want to see it.

02:28:46.000 --> 02:28:47.000
That's only hundreds.

02:28:47.000 --> 02:28:51.000
and it made some of the clocks dark, for night, and some…

02:28:51.000 --> 02:28:54.000
Yes. That was in the instructions.

02:28:54.000 --> 02:28:55.000
If it's… and remember, it said 6 PM to 6 AM is considered night.

02:28:55.000 --> 02:29:00.000
Okay, I didn't see that.
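The day/night rule from the instructions (6 PM through 6 AM counts as night) reduces to a one-liner; a sketch, with the function name being my own rather than the generated code's:

```javascript
// Night runs from 18:00 (6 PM) up to, but not including, 06:00 (6 AM).
function isNight(hour) {
  return hour >= 18 || hour < 6;
}
```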

02:29:00.000 --> 02:29:01.000
We didn't want to do anything super fancy.

02:29:01.000 --> 02:29:03.000
Right. Okay.

02:29:03.000 --> 02:29:05.000
And here's the HTML.

02:29:05.000 --> 02:29:10.000
Look at that, it's only 24 lines long.

02:29:10.000 --> 02:29:13.000
Very nice.

02:29:13.000 --> 02:29:17.000
Though it did all this…

02:29:17.000 --> 02:29:19.000
These are our specs, here's the plans.

02:29:19.000 --> 02:29:25.000
Save for us?

02:29:25.000 --> 02:29:28.000
Okay, let's see… it's still thinking, and…

02:29:28.000 --> 02:29:30.000
going on about that.

02:29:30.000 --> 02:29:36.000
So again, we're close to being done. We've got a few other things we might want to add. Jans and I

02:29:36.000 --> 02:29:39.000
thought of some additional features.

02:29:39.000 --> 02:29:41.000
that we might want to do.

02:29:41.000 --> 02:29:46.000
Um, so… but do we have any thoughts or comments?

02:29:46.000 --> 02:29:52.000
And if we're still waiting, I'll start going around saying, you know, what'd you guys find useful, what'd you guys find interesting?

02:29:52.000 --> 02:29:54.000
Scott, did you want me to show the lock thing?

02:29:54.000 --> 02:29:55.000
Yeah, why don't you show the lock thing for a minute?

02:29:55.000 --> 02:29:58.000
With the integrated…

02:29:58.000 --> 02:29:59.000
AI feature.

02:29:59.000 --> 02:30:00.000
Sure.

02:30:00.000 --> 02:30:01.000
Okay.

02:30:01.000 --> 02:30:06.000
I'll quit sharing.

02:30:06.000 --> 02:30:07.000
There you go.

02:30:07.000 --> 02:30:13.000
Okay. Um…

02:30:13.000 --> 02:30:21.000
Be right back.

02:30:21.000 --> 02:30:26.000
Are you guys able to see this?

02:30:26.000 --> 02:30:28.000
Thank you.

02:30:28.000 --> 02:30:29.000
Lots of nice locks.

02:30:29.000 --> 02:30:30.000
Yeah, I can see the locks.

02:30:30.000 --> 02:30:38.000
Okay, so my dad, you know, started this lock and key collection when I was born, you know, with the intent of passing it down, you know, when he passes away, etc.

02:30:38.000 --> 02:30:41.000
And it's been a hobby of his for a long time.

02:30:41.000 --> 02:30:43.000
He, he.

02:30:43.000 --> 02:30:48.000
tracked all of this by just taking pictures of it and having pictures in a folder, and then having a spreadsheet.

02:30:48.000 --> 02:30:50.000
So from that spreadsheet, I worked on developing a taxonomy,

02:30:50.000 --> 02:30:53.000
I don't…

02:30:53.000 --> 02:30:59.000
You know, for all of these dimensions, these kind of search dimensions for the locks, you know, what kind of mechanism it has and whatnot.

02:30:59.000 --> 02:31:03.000
And then this has just a basic…

02:31:03.000 --> 02:31:10.000
faceted search like you'd have on Amazon, so I can just, yeah, you know, I could check several of these and show those from there.
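The faceted search being demonstrated boils down to a simple filter: keep only the items that match every selected facet value. A minimal sketch (the facet names here are made up for illustration, not taken from the actual site):

```python
def facet_filter(items, selections):
    """Return the items matching every selected facet.

    `selections` maps a facet name (e.g. "mechanism") to the set of
    accepted values; an empty selection accepts everything.
    """
    return [item for item in items
            if all(item.get(facet) in accepted
                   for facet, accepted in selections.items())]
```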

02:31:10.000 --> 02:31:14.000
But the real neat thing in here is the AI.

02:31:14.000 --> 02:31:19.000
research. So, if I go in and edit this lock, you know, it's got some pictures,

02:31:19.000 --> 02:31:21.000
And so I have built…

02:31:21.000 --> 02:31:22.000
I built a configurable agent, and when I say configurable, you can just go into the settings of the site.

02:31:22.000 --> 02:31:26.000
So I…

02:31:26.000 --> 02:31:32.000
And adjust the taxonomy definitions, add new taxonomy things,

02:31:32.000 --> 02:31:36.000
give it new directions for what it should do when it's researching things.

02:31:36.000 --> 02:31:41.000
Um, so it combines that agent with the information that's already in here.

02:31:41.000 --> 02:31:46.000
So, this information that's in here for this lock is just directly out of the spreadsheet.

02:31:46.000 --> 02:31:49.000
that dad put this stuff in here,

02:31:49.000 --> 02:31:51.000
Um, in the spreadsheet, and…

02:31:51.000 --> 02:31:54.000
Here it is. And so you can see…

02:31:54.000 --> 02:31:56.000
it's not particularly…

02:31:56.000 --> 02:31:59.000
Complete, um…

02:31:59.000 --> 02:32:05.000
It's best if you fill in those things that you know are true, so I'm gonna say this is definitely a pancake lock.

02:32:05.000 --> 02:32:10.000
Um, and it looks like it is brass and steel.

02:32:10.000 --> 02:32:14.000
Uh, it definitely has a push key.

02:32:14.000 --> 02:32:16.000
Yeah, let's say that's good enough.

02:32:16.000 --> 02:32:21.000
So, then I'm gonna have my little magic AI button here, and it fires up the…

02:32:21.000 --> 02:32:24.000
The agent.

02:32:24.000 --> 02:32:27.000
And so it goes through a whole little methodology of its own.

02:32:27.000 --> 02:32:31.000
I have some research corpus that I pulled into the… into the local reference.

02:32:31.000 --> 02:32:32.000
It can search its own reference to see if it spots anything.

02:32:32.000 --> 02:32:36.000
And actually…

02:32:36.000 --> 02:32:40.000
Um, I have about 7,500 items in the local

02:32:40.000 --> 02:32:44.000
reference where I just grabbed data from…

02:32:44.000 --> 02:32:48.000
sites on the internet.

02:32:48.000 --> 02:32:57.000
Because there wasn't a good lock site.

02:32:57.000 --> 02:32:58.000
What is the 12 cents and $2 down at the bottom?

02:32:58.000 --> 02:33:04.000
So that's how much it's spent so far out of my limit that I set for each lock research.

02:33:04.000 --> 02:33:17.000
So the agent's running out there, doing… it's using a web search tool to search the web for more information about this particular lock. If it finds some information, it'll come back, and then based on that information, it can run out and do some more search.

02:33:17.000 --> 02:33:20.000
So it's kind of run down…

02:33:20.000 --> 02:33:23.000
lines and whatnot.

02:33:23.000 --> 02:33:29.000
And so it's 12 cents up to, up to this point. It'll be more than $0.12 before it's done.

02:33:29.000 --> 02:33:33.000
And these, these searches can take up to a couple minutes.
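The loop being described, where the agent searches the web, feeds results back to the model, follows up on new leads, and stops at a spending cap, could be sketched roughly like this. Every name in it (research_item, call_model, web_search, the $2 cap being per-lock) is an assumption for illustration, not the actual implementation:

```python
COST_LIMIT_USD = 2.00  # the "$2" cap from the demo (assumed to be per lock)

def research_item(known_fields, call_model, web_search):
    """Iteratively enrich a record until the model has no more leads
    or the running cost hits the per-item limit."""
    record = dict(known_fields)
    cost = 0.0
    # Seed the first query from whatever the owner already filled in.
    queries = [" ".join(str(v) for v in known_fields.values())]
    while queries and cost < COST_LIMIT_USD:
        results = web_search(queries.pop(0))
        # The model returns suggested field values, the cost of this
        # call, and any follow-up searches it wants to run.
        suggestion, call_cost, follow_ups = call_model(record, results)
        cost += call_cost
        record.update(suggestion)   # these become the highlighted suggestions
        queries.extend(follow_ups)  # the agent may fan out to more searches
    return record, cost
```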

02:33:33.000 --> 02:33:43.000
Oh, okay, so it's done. So this is what the UI did. All of the fields that it put suggestions in get highlighted. So it suggested a new name,

02:33:43.000 --> 02:33:46.000
Manufacturer, date, value,

02:33:46.000 --> 02:33:50.000
said you should put the diameter on there.

02:33:50.000 --> 02:33:53.000
And then it updated the… the, uh…

02:33:53.000 --> 02:33:54.000
Description.

02:33:54.000 --> 02:33:59.000
Description.

02:33:59.000 --> 02:34:05.000
All of those things are true.

02:34:05.000 --> 02:34:10.000
It got the right information.

02:34:10.000 --> 02:34:14.000
It found some information about the model.

02:34:14.000 --> 02:34:16.000
And that's correct. It was not a pin tumbler. This is a lever lock.

02:34:16.000 --> 02:34:21.000
Correct.

02:34:21.000 --> 02:34:24.000
So, so right on the front of the lock.

02:34:24.000 --> 02:34:27.000
find some comparable sales online…

02:34:27.000 --> 02:34:32.000
To adjust it… yep, it updated; it said it was pin tumbler, now it's lever.

02:34:32.000 --> 02:34:35.000
It's sudden.

02:34:35.000 --> 02:34:38.000
Time zone, you know, era,

02:34:38.000 --> 02:34:42.000
And it is in excellent condition, thank you very much.

02:34:42.000 --> 02:34:48.000
There you go. So then it gives you a report that gives you the reasoning for why it made all those changes.

02:34:48.000 --> 02:34:53.000
So it's like, here are the changes, here are the notes, and here's the reasoning. So this is ephemeral.

02:34:53.000 --> 02:34:58.000
So this report is ephemeral, it just tells you why it did it, so you can understand the choices that it made.

02:34:58.000 --> 02:35:05.000
And then if you disagree, or you think it has anything that's wrong, I can just do corrections here, say, no, no,

02:35:05.000 --> 02:35:06.000
It's actually this, I'm sure that it's from, you know, 1870 or whatever.

02:35:06.000 --> 02:35:09.000
Because they don't know.

02:35:09.000 --> 02:35:14.000
Where you type in corrections for anything that you know that it got wrong, or you could even give it directions to

02:35:14.000 --> 02:35:16.000
Go back and look more into this.

02:35:16.000 --> 02:35:18.000
You know, note this…

02:35:18.000 --> 02:35:25.000
Whatever direction you want to give, and then you can request a revision where it then does a little revision loop, but in this case, it did a great job, so I'm just going to say

02:35:25.000 --> 02:35:29.000
Accept all those things, and then I can edit it directly if I want, but I'm not going to.

02:35:29.000 --> 02:35:31.000
And I'll save it. So…

02:35:31.000 --> 02:35:33.000
Now, now that has a nice.

02:35:33.000 --> 02:35:34.000
a nice updated record of this… of this lock. Way better than the data that was there previously.

02:35:34.000 --> 02:35:39.000
Does it keep?

02:35:39.000 --> 02:35:46.000
Does it keep that logic reasoning somewhere, or just change the data fields?

02:35:46.000 --> 02:35:51.000
Or just change the data fields?

02:35:51.000 --> 02:35:54.000
Yeah, Steve Stegman, your mic's…

02:35:54.000 --> 02:35:55.000
Crazy feedback.

02:35:55.000 --> 02:35:56.000
Can Steve Stegman mute his microphone? It's interfering.

02:35:56.000 --> 02:35:57.000
Oh, I'm sorry.

02:35:57.000 --> 02:36:03.000
Yeah, the… no, the research report that it did was ephemeral. It saves all the data.

02:36:03.000 --> 02:36:04.000
Oh, okay. Thank you.

02:36:04.000 --> 02:36:12.000
And it grabs a lot of the stuff out of that report, and it adds them to these research notes. So many of the things that were in the report are here, but this has been… this is…

02:36:12.000 --> 02:36:22.000
Notes from that research, whereas the report is for a different purpose, to let you know how it came to this information.

02:36:22.000 --> 02:36:23.000
And Jan's picked up the fonts and the design.

02:36:23.000 --> 02:36:24.000
Okay, okay, thank you.

02:36:24.000 --> 02:36:26.000
And then… Right.

02:36:26.000 --> 02:36:36.000
And like I said, you know, this is all configurable. So here's this AI guide; it's a big chunk of the agent that says, hey, you're a research assistant agent for an antique padlock

02:36:36.000 --> 02:36:42.000
and key collection. Um, these are your research standards, these are the places where you should look, and prefer…

02:36:42.000 --> 02:36:46.000
prefer the higher things in this hierarchy for your data sources.

02:36:46.000 --> 02:36:51.000
It just gives it guidance for how it should perform its job.

02:36:51.000 --> 02:36:56.000
And then in addition to that, it defines all of those taxonomies. So it says, you know, hey, material.

02:36:56.000 --> 02:37:00.000
it says aluminum is lightweight and silver, and it tells when, when…

02:37:00.000 --> 02:37:03.000
Padlock bodies started being created with aluminum.

02:37:03.000 --> 02:37:06.000
etc. So it can make better decisions.

02:37:06.000 --> 02:37:13.000
And if I wanted to, I could add more stuff here. My dad can go in there and adjust and add a new tag. For instance, he just added

02:37:13.000 --> 02:37:19.000
switch locks and signal locks; he wanted a couple extra tags for railroad locks.

02:37:19.000 --> 02:37:25.000
And so that it can tag things properly as a signal lock, and signal lock is this.

02:37:25.000 --> 02:37:29.000
You know, how to tell a signal lock from something else and.

02:37:29.000 --> 02:37:31.000
Um, but…

02:37:31.000 --> 02:37:36.000
And so that's one of the reasons it takes so much to do this: we're shoving a lot of that context in.

02:37:36.000 --> 02:37:38.000
It takes a lot of those tokens in,

02:37:38.000 --> 02:37:41.000
Um, to, to do its job.

02:37:41.000 --> 02:37:48.000
And I didn't even remember; it was on that research report also, how much it cost to run that report.

02:37:48.000 --> 02:37:51.000
Did anybody notice how much it cost? I didn't

02:37:51.000 --> 02:37:54.000
pay attention when I looked at it.

02:37:54.000 --> 02:37:56.000
I saw 12 cents, and that was it.

02:37:56.000 --> 02:38:01.000
Yeah, I do have stats I'm tracking, and I can see how much dad has spent on this.

02:38:01.000 --> 02:38:03.000
You know…

02:38:03.000 --> 02:38:05.000
150 bucks.

02:38:05.000 --> 02:38:07.000
140 bucks.

02:38:07.000 --> 02:38:09.000
You know, and that's…

02:38:09.000 --> 02:38:14.000
Where do you get old locks like this? It's an amazing collection.

02:38:14.000 --> 02:38:17.000
Well, dad's been collecting, you know, for 58 years.

02:38:17.000 --> 02:38:18.000
Sales.

02:38:18.000 --> 02:38:19.000
So.

02:38:19.000 --> 02:38:21.000
At yard sales and things like that? Yeah.

02:38:21.000 --> 02:38:26.000
And, you know, estate sales, antique stores, auctions, you know.

02:38:26.000 --> 02:38:29.000
Et cetera. When he's traveling.

02:38:29.000 --> 02:38:30.000
I mean, he's gotten stuff from Africa and he's gotten.

02:38:30.000 --> 02:38:34.000
Are…

02:38:34.000 --> 02:38:38.000
Now, this data is just drawn from pictures that he's found, is that it, or actual locks?

02:38:38.000 --> 02:38:42.000
Well, no, these are photographs of the locks that he's collected.

02:38:42.000 --> 02:38:46.000
These are photographs that dad took of the items in his collection.

02:38:46.000 --> 02:38:47.000
And then, for this item, he put this information in the spreadsheet.

02:38:47.000 --> 02:38:52.000
Okay.

02:38:52.000 --> 02:38:57.000
But someone could do this similar kind of thing just from pictures, right?

02:38:57.000 --> 02:38:59.000
Um, it actually does a pretty poor

02:38:59.000 --> 02:39:07.000
job from just looking at the picture. It's helpful, but giving it this text…

02:39:07.000 --> 02:39:08.000
Gives it a lot. Yeah, it helps it a lot.

02:39:08.000 --> 02:39:12.000
Additionally. Okay. Okay. So.

02:39:12.000 --> 02:39:17.000
It helps a lot; just going from an image, it has a really hard time.

02:39:17.000 --> 02:39:26.000
Because of that, I remember a couple of years back talking to a guy here in St. Louis that worked for

02:39:26.000 --> 02:39:36.000
a company that did labels for beer. And he had collections going back into the 30s and even earlier.

02:39:36.000 --> 02:39:47.000
And you know. It was quite interesting, and I… recommended that he put that collection up on Pinterest, but he never did, that I know of.

02:39:47.000 --> 02:39:48.000
But collections like this can be very interesting, I think.

02:39:48.000 --> 02:40:01.000
Yeah. So, the description for this: it said that it says Rock Island Lines and that it has Signal in a cursive script below.

02:40:01.000 --> 02:40:02.000
Mm-hmm.

02:40:02.000 --> 02:40:08.000
It did recognize all that and add that description. So it was able to read the text on the lock from the image.

02:40:08.000 --> 02:40:09.000
Right.

02:40:09.000 --> 02:40:14.000
of Rock Island Lines and Signal, and differentiate that it was, you know, this big print followed by that.

02:40:14.000 --> 02:40:15.000
And that information, you know, that text information,

02:40:15.000 --> 02:40:17.000
Right.

02:40:17.000 --> 02:40:22.000
is really important to the agent to then include that text in web searches.

02:40:22.000 --> 02:40:24.000
To find out more information about it.

02:40:24.000 --> 02:40:25.000
So…

02:40:25.000 --> 02:40:28.000
Hey, Jen, before you leave, go back to that one real quick.

02:40:28.000 --> 02:40:29.000
Sure.

02:40:29.000 --> 02:40:33.000
The key hasn't been cut. Does that mean you don't have a key to the lock?

02:40:33.000 --> 02:40:39.000
Um, no, actually, the key is cut; it's just kind of hard to see against that wood grain.

02:40:39.000 --> 02:40:40.000
That was a good catch, though, Lee.

02:40:40.000 --> 02:40:41.000
Yeah.

02:40:41.000 --> 02:40:45.000
Yeah, I see the I see. Okay.

02:40:45.000 --> 02:40:52.000
Yeah, that's a cylinder key. It has that; that's a newer design for

02:40:52.000 --> 02:40:56.000
Dad's padlocks.

02:40:56.000 --> 02:40:57.000
It looks pretty modern, actually.

02:40:57.000 --> 02:41:05.000
You know, these flat steel keys were a big thing for a long time, and before that you had these barrel keys that went over a post; they also call them a post key.

02:41:05.000 --> 02:41:15.000
And then a bit key is what goes into something that doesn't have that post, you know, it'd be just like that, like a skeleton key that doesn't have the post in it, like that. That would be a bit key right here.

02:41:15.000 --> 02:41:17.000
Or it's just a solid cylinder.

02:41:17.000 --> 02:41:21.000
With a bit.

02:41:21.000 --> 02:41:24.000
My dad has these up all around the house on wooden boards and that kind of stuff.

02:41:24.000 --> 02:41:27.000
There's a lot of them in the house, it's pretty amazing.

02:41:27.000 --> 02:41:28.000
It really is.

02:41:28.000 --> 02:41:30.000
It's like walking through a lock and key museum in our living room.

02:41:30.000 --> 02:41:32.000
Yeah.

02:41:32.000 --> 02:41:36.000
So how long would it be before you have a skinny robot to walk around and take them down and clean them for you?

02:41:36.000 --> 02:41:38.000
Yeah, yeah, neat.

02:41:38.000 --> 02:41:47.000
All right, so there's that thing. So that's an example of AI integration in a way that

02:41:47.000 --> 02:41:48.000
Might be useful.

02:41:48.000 --> 02:41:50.000
How long did it take you to build the…

02:41:50.000 --> 02:41:55.000
the basic system. Now, I know it was kind of leapfrogging on the recipe creator you'd built previously.

02:41:55.000 --> 02:41:58.000
Yeah, I don't know, Scott, I've spent a lot of time on this.

02:41:58.000 --> 02:42:04.000
You know, for the last round of it, I did a whole bunch of refinement on it. It took me.

02:42:04.000 --> 02:42:09.000
All weekend. You know, and you would have considered it done before then.

02:42:09.000 --> 02:42:12.000
you know, before that time I spent on it.

02:42:12.000 --> 02:42:16.000
That's when I did that looping agent and all the…

02:42:16.000 --> 02:42:18.000
the price tracking and…

02:42:18.000 --> 02:42:21.000
refinement. So I've spent a lot of time refining that agent.

02:42:21.000 --> 02:42:22.000
But here's the other question. How long would it take you if you did it by hand?

02:42:22.000 --> 02:42:25.000
Hmm.

02:42:25.000 --> 02:42:31.000
Oh god, ages. I mean, I've been talking to dad about doing this for years.

02:42:31.000 --> 02:42:32.000
And you never did it, because you were like, this is gonna take forever.

02:42:32.000 --> 02:42:39.000
I never did it, you know, because it would take so long and so much effort.

02:42:39.000 --> 02:42:42.000
So, I do not know how to stop sharing.

02:42:42.000 --> 02:42:43.000
the screen.

02:42:43.000 --> 02:42:47.000
Uh, there should be a bar, a row above… hit escape.

02:42:47.000 --> 02:42:49.000
And that should show the bar if you hit it.

02:42:49.000 --> 02:42:51.000
No.

02:42:51.000 --> 02:42:56.000
up in the tab above, hit the little X up there.

02:42:56.000 --> 02:42:59.000
I'm…

02:42:59.000 --> 02:43:03.000
Yeah, I've just got a green…

02:43:03.000 --> 02:43:06.000
Band around the window that I'm sharing.

02:43:06.000 --> 02:43:07.000
You should.

02:43:07.000 --> 02:43:08.000
Oh, to stop sharing. Or are just…

02:43:08.000 --> 02:43:16.000
You should see, like, a bar with all the buttons on it, and one of them is stop sharing, but you have to escape to see the bar.

02:43:16.000 --> 02:43:19.000
Okay, apparently you found it.

02:43:19.000 --> 02:43:22.000
Okay.

02:43:22.000 --> 02:43:23.000
Alright, let me take…

02:43:23.000 --> 02:43:24.000
Thank you very much.

02:43:24.000 --> 02:43:25.000
Yeah, that's really cool. So, um, everybody,

02:43:25.000 --> 02:43:27.000
Thank you. Yes.

02:43:27.000 --> 02:43:31.000
Uh… I, uh, it did it!

02:43:31.000 --> 02:43:34.000
There you go. Um…

02:43:34.000 --> 02:43:36.000
And I was testing it.

02:43:36.000 --> 02:43:40.000
You know, here we go with our thing… with our clocks.

02:43:40.000 --> 02:43:43.000
The only problem is when you get to the end, did you see that little pause?

02:43:43.000 --> 02:43:46.000
So when I get to the far right or far left…

02:43:46.000 --> 02:43:48.000
There's, like, a pause…

02:43:48.000 --> 02:43:51.000
And then, it kind of wraps it.

02:43:51.000 --> 02:43:55.000
And I told it, so here we go…

02:43:55.000 --> 02:43:58.000
There, see that?

02:43:58.000 --> 02:43:59.000
Yes.

02:43:59.000 --> 02:44:08.000
And so, I told it and said, hey, uh, whenever I get to the far right or far left, there's a pause. And it said, okay, and it went… and it said, okay, I fixed it, and I'm like, no, you didn't, not enough.

02:44:08.000 --> 02:44:15.000
It's still there, and so that's what you're seeing here, is it going through to try to fix…

02:44:15.000 --> 02:44:18.000
the problem, and it analyzed the problem.

02:44:18.000 --> 02:44:19.000
Right? Yeah.

02:44:19.000 --> 02:44:22.000
One thing, one thing to know is it can't see motion.

02:44:22.000 --> 02:44:27.000
It takes screenshots of things in Playwright and that kind of stuff, so it has a really hard time.

02:44:27.000 --> 02:44:31.000
With the animation and motion issues.

02:44:31.000 --> 02:44:36.000
Yep. But it's doing pretty good so far, we'll see.

02:44:36.000 --> 02:44:40.000
It's using Playwright, so, you know, it's doing it itself.

02:44:40.000 --> 02:44:48.000
But I can still interact with it and look at it, but it's doing it itself. Now, the next thing I would add is, I want a button that takes me back to my home clock.

02:44:48.000 --> 02:44:49.000
Sure.

02:44:49.000 --> 02:44:51.000
From anywhere.

02:44:51.000 --> 02:44:54.000
So, I can ask it that.

02:44:54.000 --> 02:44:55.000
But again, while we're waiting, damn, that was awesome.

02:44:55.000 --> 02:44:57.000
Yeah.

02:44:57.000 --> 02:45:01.000
Uh, can you… why don't we go around and see what you guys found interesting, uh, about this?

02:45:01.000 --> 02:45:03.000
what you're taking away from it,

02:45:03.000 --> 02:45:08.000
Um, and uh, you know, while this thing keeps chugging away,

02:45:08.000 --> 02:45:18.000
Um, so, uh, I'll start with you, Stan, because you're in the top left. What did you find interesting about this, or what are you going to take away?

02:45:18.000 --> 02:45:30.000
It's gonna be… I don't know where I'm gonna put it in my list of undone projects.

02:45:30.000 --> 02:45:31.000
Yeah.

02:45:31.000 --> 02:45:46.000
because I got too many other things going on. But the only thing I can think of offhand that might be useful to me is, although I don't have everything, I've been subscribed to Analog science fiction magazine since I was 15.

02:45:46.000 --> 02:45:50.000
and. It might help to catalog those or not.

02:45:50.000 --> 02:45:52.000
Yeah.

02:45:52.000 --> 02:46:03.000
So, when you said… go ahead, go ahead, I'm sorry.

02:46:03.000 --> 02:46:04.000
Nice.

02:46:04.000 --> 02:46:07.000
Or, more likely, I would catalog some 35-millimeter slides that are left over from my dad and my brothers, etc., etc. And I've got about

02:46:07.000 --> 02:46:08.000
Nice.

02:46:08.000 --> 02:46:11.000
5,000 of those. Going back to when I was.

02:46:11.000 --> 02:46:13.000
just barely tall enough to walk.

02:46:13.000 --> 02:46:18.000
Now, you said you don't know when you're gonna work it into your many projects to do,

02:46:18.000 --> 02:46:19.000
Just like Jan's just said, and I've experienced this, I know others have…

02:46:19.000 --> 02:46:20.000
Right.

02:46:20.000 --> 02:46:24.000
wire

02:46:24.000 --> 02:46:32.000
But, when you start using this, all of a sudden you're gonna be like, you know what? I can get that thing done that I've been wanting to do for a long time.

02:46:32.000 --> 02:46:33.000
Uh, and it will allow you to do that.

02:46:33.000 --> 02:46:34.000
Yeah. Very, very much.

02:46:34.000 --> 02:46:39.000
So, more people now are getting more projects done.

02:46:39.000 --> 02:46:47.000
Yeah, very much like that little bitty script file I've been saying for 7 years I need to write, so I don't have to do the 5 commands from the command line.

02:46:47.000 --> 02:46:50.000
Yeah.

02:46:50.000 --> 02:46:51.000
Yeah. Good. Uh, Brian B., how about you?

02:46:51.000 --> 02:46:53.000
That same kind of thing. Okay.

02:46:53.000 --> 02:46:56.000
What'd you find interesting?

02:46:56.000 --> 02:46:59.000
Uh, loved it all from both of you, and…

02:46:59.000 --> 02:47:02.000
your communications are fantastic, by the way.

02:47:02.000 --> 02:47:07.000
Yeah, I… I like the code generation. Um, I… I…

02:47:07.000 --> 02:47:11.000
don't like doing web stuff, personally, so it's kinda…

02:47:11.000 --> 02:47:14.000
Nice to have something that's going to do it for me, which is good.

02:47:14.000 --> 02:47:17.000
Sure. Totally get that.

02:47:17.000 --> 02:47:21.000
I love the, uh, the ability to…

02:47:21.000 --> 02:47:23.000
improve your productivity.

02:47:23.000 --> 02:47:26.000
Um, it'd be interesting to see how

02:47:26.000 --> 02:47:31.000
Other than, you know, code creation, what people do typically to…

02:47:31.000 --> 02:47:32.000
Uh, improve their…

02:47:32.000 --> 02:47:34.000
I have a good… thank you for asking that, Brian.

02:47:34.000 --> 02:47:35.000
Oh, good.

02:47:35.000 --> 02:47:37.000
I have something to show you. Uh…

02:47:37.000 --> 02:47:38.000
Awesome.

02:47:38.000 --> 02:47:39.000
So…

02:47:39.000 --> 02:47:43.000
I could… I could… I could tell you, too, like, my mother-in-law had a

02:47:43.000 --> 02:47:45.000
USB drive.

02:47:45.000 --> 02:47:50.000
that had a big dump of photos and images on it from her old phone.

02:47:50.000 --> 02:47:53.000
That that was done through some sort of.

02:47:53.000 --> 02:47:56.000
Windows backup thing.

02:47:56.000 --> 02:48:01.000
And I was like, well, and she's like, I don't know how to get these off here, or what to do with it.

02:48:01.000 --> 02:48:04.000
I had Claude Code help me pull it out,

02:48:04.000 --> 02:48:07.000
go through all the metadata for all the photos.

02:48:07.000 --> 02:48:09.000
Organize it by date,

02:48:09.000 --> 02:48:14.000
Um, and… and label it and organize it for her.

02:48:14.000 --> 02:48:17.000
Man, was that… was that handy?

02:48:17.000 --> 02:48:24.000
getting all that stuff off there, it would have taken me hours and hours and hours and hours, and all the files got renamed from, you know, the horrible names that…

02:48:24.000 --> 02:48:29.000
that files get from photos, you know, DSC-whatever it was, and that kind of thing.

02:48:29.000 --> 02:48:35.000
And gave her back a nice set of images that, you know, she thought she had lost forever of her grandkids and whatnot. So that's awesome.
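The cleanup being described could be sketched as a small script that sorts each photo into year/month folders and renames it by date. The get_date_taken callback is an assumption standing in for a real metadata reader (in practice you'd read EXIF, e.g. with Pillow):

```python
from pathlib import Path
import shutil

def organize_photos(src_dir, dest_dir, get_date_taken):
    """Copy photos into dest_dir/YYYY/MM/ with a date-prefixed name.

    `get_date_taken` is a hypothetical callback: given a file path,
    it returns a datetime for when the photo was taken.
    """
    src, dest = Path(src_dir), Path(dest_dir)
    for photo in sorted(src.iterdir()):
        if not photo.is_file():
            continue
        taken = get_date_taken(photo)
        folder = dest / f"{taken:%Y}" / f"{taken:%m}"
        folder.mkdir(parents=True, exist_ok=True)
        # Keep the original name as a suffix so nothing collides.
        shutil.copy2(photo, folder / f"{taken:%Y-%m-%d}_{photo.name}")
```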

02:48:35.000 --> 02:48:36.000
That kind of thing.

02:48:36.000 --> 02:48:37.000
Yeah. So, uh, here's something else.

02:48:37.000 --> 02:48:39.000
That is awesome. Thank you. I'd need to do that myself.

02:48:39.000 --> 02:48:47.000
Here's something else we've used it for. Uh, this was Jans's idea and his baby. So, we have… we host websites, as you guys know,

02:48:47.000 --> 02:48:54.000
And we decided to start setting… we've… Jans and I have always taken a lot of notes as to how to set up our servers.

02:48:54.000 --> 02:49:02.000
And Jan's had the idea a couple of months ago, uh, why don't we see if Claude can help automate that whole process, because Claude…

02:49:02.000 --> 02:49:05.000
can use SSH on your computer,

02:49:05.000 --> 02:49:08.000
Connect to a server and run commands.

02:49:08.000 --> 02:49:14.000
And so, we've now done that on every single one of our servers. So we have a server called Ariadne,

02:49:14.000 --> 02:49:16.000
And Jans wrote

02:49:16.000 --> 02:49:21.000
This document here, and this is a document that…

02:49:21.000 --> 02:49:24.000
tells it everything it needs to do.

02:49:24.000 --> 02:49:28.000
To set up the server. Now, a lot of this was… he copied and pasted from…

02:49:28.000 --> 02:49:32.000
the documents he and I have generated over the years.

02:49:32.000 --> 02:49:35.000
I'm misspeaking, Jans, I know you'll correct me.

02:49:35.000 --> 02:49:38.000
And so, this tells it every single step,

02:49:38.000 --> 02:49:43.000
He wants to do, and so here are the…

02:49:43.000 --> 02:49:46.000
20 steps.

02:49:46.000 --> 02:49:49.000
that it's gonna walk through and do.

02:49:49.000 --> 02:49:55.000
And… this document, these are follow-up items, things later.

02:49:55.000 --> 02:49:58.000
Um, is the CLAUDE.md, Jans, the one that says…

02:49:58.000 --> 02:50:02.000
No, you should look at the README.

02:50:02.000 --> 02:50:04.000
The README.

02:50:04.000 --> 02:50:08.000
Right. So, you want to quickly walk through this?

02:50:08.000 --> 02:50:14.000
So, so yeah, this is a non-coding project. So this project is for server administration.

02:50:14.000 --> 02:50:18.000
And AI-assisted server administration of this server, okay?

02:50:18.000 --> 02:50:25.000
And so, I wrote this README for me and Scott to say what this project is for. So it's a guide document,

02:50:25.000 --> 02:50:27.000
for collaborating on

02:50:27.000 --> 02:50:30.000
administering that server, and documentation is…

02:50:30.000 --> 02:50:32.000
Super important there. So we say, hey,

02:50:32.000 --> 02:50:39.000
Anytime we do stuff, we always write down what we did and how to do it so we can repeat it, and we can go back and fix it and figure out, what do we do here that

02:50:39.000 --> 02:50:42.000
that has messed up, or whatever the case may be.

02:50:42.000 --> 02:50:44.000
So, I say, document it.

02:50:44.000 --> 02:50:50.000
One of the things that it does, and Scott, and I would do really bad about it, is when we would work on something together.

02:50:50.000 --> 02:50:58.000
We would forget, you know, when did we change or install that thing on the server? When did we do that? Because it takes us a lot of time to write everything down.

02:50:58.000 --> 02:51:03.000
So when we're dealing with this, then it creates session logs every time I'm working with it,

02:51:03.000 --> 02:51:08.000
for when we installed this and when we set up this new thing, and when we added this filter to fail2ban, and when…

02:51:08.000 --> 02:51:12.000
You know, all this different stuff that we did, so we have a record.

02:51:12.000 --> 02:51:15.000
of everything we did on the server in our session logs.

02:51:15.000 --> 02:51:18.000
And we have a.

02:51:18.000 --> 02:51:20.000
And it can…

02:51:20.000 --> 02:51:31.000
guide us to looking at things, you know? So I'd say, what are my options with NGINX for handling, you know, routing on whatever? And it runs through that stuff. So it does a really good job of finding documentation.

02:51:31.000 --> 02:51:34.000
Instead of us digging through NGINX docs.

02:51:34.000 --> 02:51:37.000
You know, and of course, we ask it for a…

02:51:37.000 --> 02:51:44.000
Um, references. So, it's all set up in here. I have hooks so that every time it does any kind of bash command,

02:51:44.000 --> 02:51:48.000
it takes the command and the response and stores it in a bash log.

02:51:48.000 --> 02:51:52.000
So, if I have it do anything, I can watch every… it's like…

02:51:52.000 --> 02:51:58.000
It's like a tmux kind of thing, where I'm watching every command and response from the server that it's doing,

02:51:58.000 --> 02:52:03.000
So I can both learn from it and watch what's going on.

02:52:03.000 --> 02:52:05.000
And monitor it.
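The bash-logging hook being described could boil down to something like this sketch: a small function that a post-command hook might invoke to append each command and its output to a running log. The log path and entry format here are assumptions, not the actual setup:

```python
from datetime import datetime
from pathlib import Path

def log_bash(command, output, log_file="logs/bash.log"):
    """Append one timestamped command/response entry to the bash log.

    Intended to be called by a hook after every shell command the
    assistant runs, so you can watch (and later audit) everything it did.
    """
    path = Path(log_file)
    path.parent.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with path.open("a", encoding="utf-8") as f:
        f.write(f"[{stamp}] $ {command}\n{output}\n\n")
```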

02:52:05.000 --> 02:52:06.000
So, that would be in here, right?

02:52:06.000 --> 02:52:10.000
there'd be a session log that talks about what was done on that server.

02:52:10.000 --> 02:52:12.000
Um.

02:52:12.000 --> 02:52:13.000
et cetera.

02:52:13.000 --> 02:52:17.000
Yeah, so here's an example right here. Session logs from March 18th.

02:52:17.000 --> 02:52:23.000
Right? Continue server setup, steps 15 and 17.

02:52:23.000 --> 02:52:28.000
Plus control panel build-out, and so it goes through and tells you every single thing it did here.

02:52:28.000 --> 02:52:31.000
That's not everything, that's everything that I did.

02:52:31.000 --> 02:52:32.000
Oh.

02:52:32.000 --> 02:52:35.000
That it assisted me with, thank you very much.

02:52:35.000 --> 02:52:37.000
You're right.

02:52:37.000 --> 02:52:39.000
Sorry.

02:52:39.000 --> 02:52:43.000
Anyway, that's another cool example. Again, not coding!

02:52:43.000 --> 02:52:47.000
But… makes life far, far, far easier.

02:52:47.000 --> 02:52:48.000
for us to do.

02:52:48.000 --> 02:52:57.000
Yeah. So I have a server admin project for each one of our servers, plus I have a dev-local project that I use

02:52:57.000 --> 02:52:59.000
for keeping…

02:52:59.000 --> 02:53:04.000
my own workstation administered. So it's a project that I work out of.

02:53:04.000 --> 02:53:07.000
So it can help me administer my own workstation.

02:53:07.000 --> 02:53:10.000
And it has a drift check every time I fire it up, it checks to make sure

02:53:10.000 --> 02:53:14.000
that any configuration changes that I made get backed up.

02:53:14.000 --> 02:53:18.000
and recorded, so that if I install, you know,

02:53:18.000 --> 02:53:22.000
It watches my packages, so if I do stuff with Homebrew

02:53:22.000 --> 02:53:24.000
or pip, or, you know,

02:53:24.000 --> 02:53:26.000
anything like that, it…

02:53:26.000 --> 02:53:33.000
keeps a record of that, so it's essentially a build book, so if I… my machine gets struck by lightning, or if I get a new computer,

02:53:33.000 --> 02:53:37.000
I can easily set up the new machine to mirror my current workstation.
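
The drift-check idea described here can be sketched roughly as a shell script. This is a minimal illustration, not the speakers' actual setup: the `pkg_list` function is a stand-in for a real package query like `brew list` or `pip list`, and a real version would use a fixed state directory (and commit the log to the build-book) instead of a temp dir.

```shell
#!/bin/sh
# Hedged sketch of a "drift check": on each session start, snapshot the current
# package list, diff it against the last snapshot, and record any changes.
set -eu

state_dir=$(mktemp -d)            # a real setup would use a fixed dir, e.g. ~/.drift

pkg_list() {                      # stand-in for 'brew list' / 'pip list'
    printf 'pkg-a\npkg-b\npkg-c\n'
}

printf 'pkg-a\npkg-b\n' > "$state_dir/last"   # pretend this was the previous snapshot
pkg_list > "$state_dir/now"

# Record whatever changed since the last run, then roll the snapshot forward.
diff "$state_dir/last" "$state_dir/now" > "$state_dir/drift.log" || true
mv "$state_dir/now" "$state_dir/last"
cat "$state_dir/drift.log"
```

The point of the pattern is that the diff, not the full package list, is what gets appended to the build book, so rebuilding a machine is a replay of recorded changes.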

02:53:37.000 --> 02:53:38.000
Good. By the way, it…

02:53:38.000 --> 02:53:40.000
That's all awesome.

02:53:40.000 --> 02:53:45.000
It added the, uh, home button, so I told it, hey, I want a button that goes back to my current clock.

02:53:45.000 --> 02:53:48.000
So it did that, so if I'm, you know…

02:53:48.000 --> 02:53:51.000
over here, and I click it.

02:53:51.000 --> 02:53:54.000
Takes me right back. That's great! I think it's a little too small.

02:53:54.000 --> 02:53:57.000
So, I said, can you make it a bit bigger?

02:53:57.000 --> 02:53:58.000
Well, you could have done that.

02:53:58.000 --> 02:54:01.000
I know it could have done that, but I'm demonstrating.

02:54:01.000 --> 02:54:04.000
Of course I could have done that.

02:54:04.000 --> 02:54:06.000
So, done.

02:54:06.000 --> 02:54:09.000
And… it got a little bigger.

02:54:09.000 --> 02:54:12.000
I probably would go in there manually, and like Jan said, fix it.

02:54:12.000 --> 02:54:14.000
Yeah.

02:54:14.000 --> 02:54:18.000
But, easy enough to have it do it. If I feel lazy.

02:54:18.000 --> 02:54:24.000
Okay, uh, Wayne, what did you find interesting?

02:54:24.000 --> 02:54:40.000
Well, I find the whole thing interesting. The whole AI chat thing… I can see where there are going to be a lot of people that become unemployed, or they're going to have to find something else to do. I'm going to try and get my grandkids to become plumbers.

02:54:40.000 --> 02:54:42.000
I agree.

02:54:42.000 --> 02:55:07.000
But the power is wonderful. And of course, the training and what it's trained on is going to… is going to dictate what it comes up with for an answer. In other words, you know, there's a whole lot of things that are controversial and it needs to be trained on both sides of those.

02:55:07.000 --> 02:55:20.000
Questions, so that I can see both sides of the… in an answer, see both sides, and come to my own conclusion on things. I don't want AI to be deciding things for me.

02:55:20.000 --> 02:55:50.000
Um, so, um, yeah, but, uh, yeah, the tools are wonderful and I've been using it a lot myself. Uh, I've been working on and off on a script for a couple of years now that has actually grown to gargantuan proportions, because as an engineer, I keep adding stuff to it, you know, I can't stop.

02:55:53.000 --> 02:55:54.000
Good. Good.

02:55:54.000 --> 02:56:02.000
And uh… and I wish I had started using AI a long time ago, because I've done about 80% on my own, and now using AI is making the last 20% much faster and much easier. And I'm feeding it bits and pieces of what I've come up with, and it's, uh… And making improvements to it. So good, very good. I'm very happy with it.

02:56:02.000 --> 02:56:04.000
Very cool! Excellent.

02:56:04.000 --> 02:56:13.000
Vincent, how about you?

02:56:13.000 --> 02:56:18.000
Okay, no problem. Uh, Ben's iPad?

02:56:18.000 --> 02:56:19.000
Cool. Ben's iPad?

02:56:19.000 --> 02:56:24.000
Vincent seldom talks. I'm sure he enjoyed it.

02:56:24.000 --> 02:56:25.000
Okay. Gary Meyer, how about you?

02:56:25.000 --> 02:56:28.000
Yeah, I think so.

02:56:28.000 --> 02:56:43.000
Boy, I… yeah, blah blah blah. That's kind of the way my brain is right now. There is so much good stuff that I think I picked up tonight, just how to go about

02:56:43.000 --> 02:56:50.000
setting up the whole project and the step-by-step of going through it. The one thing that I think

02:56:50.000 --> 02:56:58.000
will probably be my most valuable comment right now is, I was putting some stuff on the calendar, uh, and uh…

02:56:58.000 --> 02:57:02.000
I'll try and share my calendar here on screen.

02:57:02.000 --> 02:57:05.000
Do you want me to unshare?

02:57:05.000 --> 02:57:19.000
Yeah. Anyway, go to go to tomorrow's calendar if you can. There were 3 things that I found at meetup. And.

02:57:19.000 --> 02:57:20.000
Do I go?

02:57:20.000 --> 02:57:27.000
One of them is Ladies in AI, so obviously a bunch of women want to favor that. Resources calendar.

02:57:27.000 --> 02:57:37.000
The 2 that I really want to point out is one called Claude Code: From Setup to Agentic Workflows.

02:57:37.000 --> 02:57:52.000
You know, it sounds like it's covering a lot of the stuff you covered tonight, and the interesting thing there is, and I don't know if this is a one-time deal, it sounds like multiple occurrences.

02:57:52.000 --> 02:58:10.000
They capped the event at 500 attendees, and it filled up the cap, and it now has 540 people on the wait list to get in tomorrow night. And the other one that I didn't even bother to put on.

02:58:10.000 --> 02:58:14.000
on our calendar, but it's out on meetup for tomorrow night is.

02:58:14.000 --> 02:58:18.000
Click on 19 more. There you go.

02:58:18.000 --> 02:58:36.000
The other one was somebody who's doing it for money, and it sounded very much like the step-through of what Scott and Jan just did for us tonight. The only difference is, if you want to go tomorrow night to their session, there's only 20 people in that class.

02:58:36.000 --> 02:58:45.000
But those 20 people are paying between $280 and $350 to be in the class. I don't know if that's a one-night class, or if it's a multiple.

02:58:45.000 --> 02:58:47.000
Wow.

02:58:47.000 --> 02:58:52.000
But but the information we got tonight was very, very valuable. Thank you, both of you.

02:58:52.000 --> 02:58:58.000
Cool, thank you. Alright, uh… Eric, how about you? What'd you find useful?

02:58:58.000 --> 02:59:02.000
are interesting.

02:59:02.000 --> 02:59:05.000
Anybody? Eric?

02:59:05.000 --> 02:59:09.000
Oh, by the way, I just told it to add a toggle for switching between light mode…

02:59:09.000 --> 02:59:11.000
And, uh, now…

02:59:11.000 --> 02:59:19.000
Uh, that means respect… well, it's wrong, because I think that means respect whatever the system has.

02:59:19.000 --> 02:59:25.000
So, that's dark mode. That should be respecting the system, but it's not working, because my computer's in dark mode.

02:59:25.000 --> 02:59:29.000
So I need to tell it that. Anyway, Eric, you got anything? Or I'll move on?

02:59:29.000 --> 02:59:36.000
Okay, Steve Stickman, how about you?

02:59:36.000 --> 02:59:41.000
Steve Stigman? There you go.

02:59:41.000 --> 02:59:42.000
What'd you find interesting?

02:59:42.000 --> 02:59:48.000
Yeah.

02:59:48.000 --> 02:59:51.000
Oh, okay.

02:59:51.000 --> 02:59:52.000
Okay.

02:59:52.000 --> 02:59:56.000
Is that your… I'm currently on a bus, so I may need to get another phone. There was a lot to go through. I'll probably have to go back and watch it.

02:59:56.000 --> 02:59:58.000
Okay.

02:59:58.000 --> 03:00:04.000
It found the differences, let's say, of the…

03:00:04.000 --> 03:00:08.000
Eric, we can… I can barely understand what you're saying, I'm sorry, I think the connection's bad.

03:00:08.000 --> 03:00:09.000
But thank you.

03:00:09.000 --> 03:00:12.000
Okay. Can you hear me better now?

03:00:12.000 --> 03:00:13.000
Slightly, for now.

03:00:13.000 --> 03:00:15.000
Thank you.

03:00:15.000 --> 03:00:24.000
Yeah, I said I'll probably have to go back through the… the, uh, PowerPoint.

03:00:24.000 --> 03:00:29.000
Okay. We can't understand what you're saying, Eric, I'm sorry.

03:00:29.000 --> 03:00:30.000
I'm gonna have to move on, sorry.

03:00:30.000 --> 03:00:32.000
Oh.

03:00:32.000 --> 03:00:39.000
Uh, Steve Stegman, what do you got?

03:00:39.000 --> 03:00:43.000
Well…

03:00:43.000 --> 03:00:45.000
Okay, that's good.

03:00:45.000 --> 03:00:51.000
Just kind of amazed at the whole thing. I didn't… I didn't realize all this was out there.

03:00:51.000 --> 03:00:57.000
I haven't done a lot of coding, you know, in the last… Really, I guess the last 10 years, I've done not that much.

03:00:57.000 --> 03:00:59.000
Sure.

03:00:59.000 --> 03:01:00.000
Okay.

03:01:00.000 --> 03:01:04.000
It's fascinating, though. I'd love to get back into it, but, you know, them days is gone.

03:01:04.000 --> 03:01:07.000
Well, remember, it's not just for coding.

03:01:07.000 --> 03:01:09.000
you know, James told you…

03:01:09.000 --> 03:01:11.000
about what he did for his mother-in-law.

03:01:11.000 --> 03:01:15.000
And we've been using it to administer our servers, so it's…

03:01:15.000 --> 03:01:18.000
It's basically whatever you can think of that you need

03:01:18.000 --> 03:01:22.000
deep technical, uh, help with.

03:01:22.000 --> 03:01:25.000
it'll provide.

03:01:25.000 --> 03:01:26.000
boss.

03:01:26.000 --> 03:01:34.000
Well, I've been using ChatGPT, uh, for, you know, I guess mostly… as a web research tool would be the right way to put it.

03:01:34.000 --> 03:01:36.000
Mm-hmm.

03:01:36.000 --> 03:01:45.000
You know, just… Interesting things pop into my mind, and it does an amazing job of finding information on it.

03:01:45.000 --> 03:01:47.000
Absolutely.

03:01:47.000 --> 03:01:55.000
I'm getting fairly lazy these days, because… You know, it does web searching way better than I can.

03:01:55.000 --> 03:01:56.000
Yep.

03:01:56.000 --> 03:02:03.000
Um… But it's… it's… it's interesting. I haven't tried to write any code.

03:02:03.000 --> 03:02:06.000
That's fine.

03:02:06.000 --> 03:02:08.000
I may try to, I don't know. We'll see where that goes.

03:02:08.000 --> 03:02:13.000
I'd say have fun experimenting.

03:02:13.000 --> 03:02:14.000
Good. Okay, thanks, Steve.

03:02:14.000 --> 03:02:15.000
Yep, that's what it is. Enjoy the talk.

03:02:15.000 --> 03:02:20.000
Thank you, appreciate it. Lee, how about you? What'd you find interesting?

03:02:20.000 --> 03:02:24.000
Oh, all kinds of stuff, you know, number one, you're a great speaker, so we love you.

03:02:24.000 --> 03:02:31.000
Well, thanks. Thank you.

03:02:31.000 --> 03:02:32.000
I figured.

03:02:32.000 --> 03:02:37.000
And secondly, I've been using OpenAI and ChatGPT for a couple of years now, but… since last year, it's gotten a heck of a lot

03:02:37.000 --> 03:02:49.000
easier to use, because it's got answers to every question I ask it, whether it's on Btrfs or Docker containers or configuring Jitsi.

03:02:49.000 --> 03:02:55.000
And, you know, it just saves me probably two-thirds of my time.

03:02:55.000 --> 03:02:56.000
Yeah. Yeah.

03:02:56.000 --> 03:03:07.000
On any project. And the other thing is, I just looked at it, and OpenAI has Codex for the command line for free, if you have a Pro plan.

03:03:07.000 --> 03:03:09.000
Yes.

03:03:09.000 --> 03:03:12.000
That's absolutely true. That's why I've been using it.

03:03:12.000 --> 03:03:16.000
Because I had the pro plan. Now, they have…

03:03:16.000 --> 03:03:19.000
for 2 months or 3 months or so?

03:03:19.000 --> 03:03:24.000
They were giving you double the amount of usage, and that ends very soon.

03:03:24.000 --> 03:03:27.000
So… we'll see what that means.

03:03:27.000 --> 03:03:28.000
So, something to note about that: Codex actually uses a specific model, right?

03:03:28.000 --> 03:03:34.000
Oh, I've.

03:03:34.000 --> 03:03:35.000
Uh, GPT-5.4.

03:03:35.000 --> 03:03:36.000
It's a domain.

03:03:36.000 --> 03:03:37.000
thinking, I believe.

03:03:37.000 --> 03:03:41.000
No. No, you you you can set your model. I saw that when I installed it.

03:03:41.000 --> 03:03:42.000
Here, I'll show you.

03:03:42.000 --> 03:03:47.000
Okay, I thought Codex had a domain-specific model where it's specifically trained on coding.

03:03:47.000 --> 03:03:52.000
and doesn't have the general knowledge like the other GPT models have.

03:03:52.000 --> 03:03:53.000
Yeah.

03:03:53.000 --> 03:03:57.000
Oh, I'm… oh, I'm sorry. I I know you could change. I don't know what the options are.

03:03:57.000 --> 03:03:59.000
Right here. This is where you choose it.

03:03:59.000 --> 03:04:02.000
Uh, by default, it's 5-4.

03:04:02.000 --> 03:04:03.000
You can choose the fuck.

03:04:03.000 --> 03:04:04.000
Yeah.

03:04:04.000 --> 03:04:06.000
Right, but you see that, Scott? The 5-3 Codex?

03:04:06.000 --> 03:04:09.000
Yeah?

03:04:09.000 --> 03:04:10.000
Oh, interesting.

03:04:10.000 --> 03:04:14.000
That's what's called a domain-specific model. That means it's been trained specifically on tons of code,

03:04:14.000 --> 03:04:19.000
And does not contain a bunch of the general information training that other models have.

03:04:19.000 --> 03:04:20.000
Right, right.

03:04:20.000 --> 03:04:25.000
So, it's focused on producing code, whereas, you know,

03:04:25.000 --> 03:04:32.000
Like, the Claude stuff is not necessarily a code-specific model, it leans into that, but it has.

03:04:32.000 --> 03:04:34.000
All the general knowledge stuff in there, too.

03:04:34.000 --> 03:04:37.000
Um, it's interesting to see how that.

03:04:37.000 --> 03:04:40.000
You know…

03:04:40.000 --> 03:04:47.000
shakes out. Like, the doctor thing that you showed, that was… I know that that uses a domain-specific model that's focused on.

03:04:47.000 --> 03:04:50.000
on not general knowledge, but medical knowledge.

03:04:50.000 --> 03:04:58.000
Yeah. And and and the other thing, Scott, one of my hot buttons right now is sovereign AI.

03:04:58.000 --> 03:05:10.000
Because if I'm going to try to help a company develop AI solutions, if I can put a box in the last… in the back room and keep all of that.

03:05:10.000 --> 03:05:11.000
Right.

03:05:11.000 --> 03:05:13.000
Technology in-house. I think that would be a big benefit.

03:05:13.000 --> 03:05:16.000
Right. Yeah, I completely agree.

03:05:16.000 --> 03:05:18.000
Plus, that makes you more useful.

03:05:18.000 --> 03:05:19.000
You know? Good!

03:05:19.000 --> 03:05:21.000
Yep.

03:05:21.000 --> 03:05:25.000
Okay, uh, that's awesome, Lee. Uh, Robert…

03:05:25.000 --> 03:05:32.000
Yeah. Can I just go back and ask Jan? What was… what's the meaning of a 5-3 Codex?

03:05:32.000 --> 03:05:33.000
That was the model.

03:05:33.000 --> 03:05:39.000
So that's just, yeah, so that what I was saying about it, that the 5 3.

03:05:39.000 --> 03:05:47.000
5-3 is just, like, one of their version numbers, but the Codex flag on it is saying that's a domain-specific model.

03:05:47.000 --> 03:05:55.000
Where the content that it was trained on isn't a bunch of general knowledge-like stuff from Wikipedia and, you know, all of the other stuff.

03:05:55.000 --> 03:05:58.000
Where it's focusing on specifically code.

03:05:58.000 --> 03:06:01.000
So the model itself has been trained on code for reproducing

03:06:01.000 --> 03:06:05.000
Um, good code patterns.

03:06:05.000 --> 03:06:06.000
That's correct.

03:06:06.000 --> 03:06:07.000
specializes in building code. Okay. Thank you. Thank you.

03:06:07.000 --> 03:06:09.000
Yep. Now, you'll notice, right here,

03:06:09.000 --> 03:06:14.000
GPT-5.3-codex: frontier Codex-optimized agentic coding model,

03:06:14.000 --> 03:06:20.000
5-4 says: latest frontier agentic coding model.

03:06:20.000 --> 03:06:21.000
Yeah, I don't know.

03:06:21.000 --> 03:06:24.000
So, I guess that one's specifically targeted for Codex, and this one isn't? That doesn't make sense.

03:06:24.000 --> 03:06:27.000
Yeah, OpenAI has always had

03:06:27.000 --> 03:06:28.000
Horrible naming.

03:06:28.000 --> 03:06:30.000
cryptic naming on its models, and you kind of had to figure…

03:06:30.000 --> 03:06:32.000
So, 5.4…

03:06:32.000 --> 03:06:35.000
5.4 includes

03:06:35.000 --> 03:06:38.000
generic and Codex, basically.

03:06:38.000 --> 03:06:42.000
So they took GPT-5.3 and GPT-5.3-codex,

03:06:42.000 --> 03:06:44.000
And they combined them into one model.

03:06:44.000 --> 03:06:45.000
Okay.

03:06:45.000 --> 03:06:47.000
So I should be using 5-4.

03:06:47.000 --> 03:06:48.000
I don't know. It kind of depends on…

03:06:48.000 --> 03:06:49.000
You should always be using… yes, always use 5-4.

03:06:49.000 --> 03:06:51.000
Yeah. Oh.

03:06:51.000 --> 03:06:55.000
The 5-3 is not really an advantage.

03:06:55.000 --> 03:06:56.000
at least not… it's not supposed to be, maybe…

03:06:56.000 --> 03:07:03.000
Yeah. Oh.

03:07:03.000 --> 03:07:06.000
Cool.

03:07:06.000 --> 03:07:07.000
Yeah.

03:07:07.000 --> 03:07:13.000
One more quickie. If you want to have some fun: when he screws up, call him a stupid shithead and see what he says.

03:07:13.000 --> 03:07:18.000
Oh, I've… I've cursed so much at it, it's not even funny. I have…

03:07:18.000 --> 03:07:20.000
volcanically…

03:07:20.000 --> 03:07:24.000
attacked it when it does stupid things repeatedly.

03:07:24.000 --> 03:07:28.000
In fact, I use… I usually dictate to my computer, and I've had…

03:07:28.000 --> 03:07:31.000
you know, people go, what… who are you talking to?

03:07:31.000 --> 03:07:39.000
You know, and I'm like, a machine! That's why it's okay. If I talk to a person like that, I would be horrible.

03:07:39.000 --> 03:07:40.000
Um, you know, so, yeah, it's pretty fun.

03:07:40.000 --> 03:07:42.000
Yeah.

03:07:42.000 --> 03:07:47.000
Yeah, I see that as deep-seated psychological deficiencies, for people that do that kind of thing to poor, helpless machines.

03:07:47.000 --> 03:07:50.000
Yes, sir. Yes, poor machines.

03:07:50.000 --> 03:07:51.000
Yeah, but you're actually tracking it by doing that.

03:07:51.000 --> 03:07:59.000
I used to… Scott, I used to know a guy. Oh, this has been 100 years ago.

03:07:59.000 --> 03:08:18.000
And, uh, he had a secretary that always took dictation, because he was always writing letters to people about stuff, you know, and somebody would irritate him, and he'd have her take dictation, and… you know, it was like your… your comment, just…

03:08:18.000 --> 03:08:19.000
Uh-huh.

03:08:19.000 --> 03:08:24.000
gross, awful, horrible things he would say. And she would type them up, give it back to him, and say, you want to send this out? Uh, no.

03:08:24.000 --> 03:08:27.000
Uh-huh. Yeah. Oh, yeah.

03:08:27.000 --> 03:08:29.000
Oh yeah, I get that.

03:08:29.000 --> 03:08:31.000
But he said it felt good to get it off his chest, you know?

03:08:31.000 --> 03:08:34.000
Of course! Of course, absolutely.

03:08:34.000 --> 03:08:37.000
Uh, Robert, how about you?

03:08:37.000 --> 03:08:50.000
Scott, Jan, great stuff. Thank you very much for doing this. Boy, things to take away, uh, tons of stuff. Obviously, I'm going to have to go through and look at the video again and the transcript, but a couple things came out

03:08:50.000 --> 03:09:11.000
that just really resonated. One of them is the domain-specific model concept. Um, I really like that, um, because, you know, it'd be really cool to have an AI that's specific for a particular subject domain, right? And, you know, you gave the example of.

03:09:11.000 --> 03:09:12.000
Oh, yeah.

03:09:12.000 --> 03:09:20.000
Right?

03:09:20.000 --> 03:09:21.000
You can.

03:09:21.000 --> 03:09:26.000
the medical industry, but I can imagine that in many, many other industries as well. The other… the question I have, though, is kind of what Lee was saying: it'd be really nice to have this if it was local. Is there… is there a Claude Code model that I can run locally, or any other model that I can run locally?

03:09:26.000 --> 03:09:27.000
Well, there's tons of models you can run locally. Did you come in late?

03:09:27.000 --> 03:09:31.000
And how would that work? For coding. For coding.

03:09:31.000 --> 03:09:34.000
Oh, um…

03:09:34.000 --> 03:09:36.000
Well,

03:09:36.000 --> 03:09:37.000
That's not a problem.

03:09:37.000 --> 03:09:38.000
I don't know. Is that Robert Satek? Hey, Robert.

03:09:38.000 --> 03:09:39.000
Robert, your voice sounded weird.

03:09:39.000 --> 03:09:40.000
It is.

03:09:40.000 --> 03:09:41.000
Yes, yes.

03:09:41.000 --> 03:09:43.000
Um, yeah.

03:09:43.000 --> 03:09:44.000
I'm happy to hear your voice, it's been way too long.

03:09:44.000 --> 03:09:47.000
It sounds weird, I'm sorry.

03:09:47.000 --> 03:10:00.000
This is Stan. I've heard Leo Laporte talk about he fed in a lot of coding book PDFs into his local.

03:10:00.000 --> 03:10:01.000
Um…

03:10:01.000 --> 03:10:02.000
Setup.

03:10:02.000 --> 03:10:05.000
And it… and what? Build his own model or is he using a RAG or something else?

03:10:05.000 --> 03:10:22.000
Yeah. I'm not sure exactly what he used, but he said he used specific PDFs of coding manuals and books that he had, for, I think, JavaScript and things like that. And he said it was very

03:10:22.000 --> 03:10:34.000
specific, but that… That may be what he had built a year or so ago. And I don't know if he still relies on that or not.

03:10:34.000 --> 03:10:37.000
So, supposedly, Gemma 4 is supposed to be

03:10:37.000 --> 03:10:43.000
much better for coding, and what I usually see is, like, equivalent to GPT-4.

03:10:43.000 --> 03:10:45.000
Right? I've seen that a lot.

03:10:45.000 --> 03:10:50.000
Um, so that's one of the reasons I just downloaded Gemma 4. Now, of course,

03:10:50.000 --> 03:10:56.000
You can get different amounts of sizes. So, for instance, the one… let me bring it up here.

03:10:56.000 --> 03:10:58.000
So you guys can see it.

03:10:58.000 --> 03:11:06.000
Um…

03:11:06.000 --> 03:11:07.000
Yes.

03:11:07.000 --> 03:11:08.000
you know, check the internet, maybe it is for some things.

03:11:08.000 --> 03:11:10.000
But is the interface a CLI? In other words, you're still using the Claude interface, but you're telling it, hey, use this backend instead? Or is it a completely different CLI?

03:11:10.000 --> 03:11:14.000
Uh, it's a CLI, that's how you're communicating with it.

03:11:14.000 --> 03:11:16.000
I mean, if I go here,

03:11:16.000 --> 03:11:18.000
And, uh, run…

03:11:18.000 --> 03:11:20.000
Oh, ollama list…

03:11:20.000 --> 03:11:27.000
Right? It's gonna list, so that's the one I installed. It has 26 billion parameters. It's 17 gigabytes.

03:11:27.000 --> 03:11:30.000
Right? And there's all different sizes.

03:11:30.000 --> 03:11:34.000
So, there's ones with 2 billion parameters, and so on. There's one that is…

03:11:34.000 --> 03:11:38.000
I think 40 gigs of space, or something, and I'm like, I'm not gonna install that.

03:11:38.000 --> 03:11:43.000
But I installed the one with 26 billion parameters at 17 gigs. I was like…

03:11:43.000 --> 03:11:48.000
I'll play with that one and see what it's like, and it's supposed to be very good.

03:11:48.000 --> 03:11:49.000
So, it just depends.

03:11:49.000 --> 03:11:54.000
Are you… are you basically telling Claude, hey, Claude, use that as my back end?

03:11:54.000 --> 03:11:59.000
No, Claude uses Claude. You would be using something like VS Code.

03:11:59.000 --> 03:12:00.000
or, um…

03:12:00.000 --> 03:12:01.000
Okay.

03:12:01.000 --> 03:12:02.000
OpenCode.

03:12:02.000 --> 03:12:04.000
Or, I think you can use it… I think… what's that?

03:12:04.000 --> 03:12:08.000
OpenCode is the Claude Code-like thing that works with all the models.

03:12:08.000 --> 03:12:09.000
Right.

03:12:09.000 --> 03:12:11.000
Um, and the only downside of it is

03:12:11.000 --> 03:12:13.000
If you have the, you know, one of the…

03:12:13.000 --> 03:12:17.000
$20, $100, $200 Claude plans.

03:12:17.000 --> 03:12:20.000
You have to use Claude Code for that, and you can't actually use…

03:12:20.000 --> 03:12:23.000
Um, some other model.

03:12:23.000 --> 03:12:31.000
or some other harness, actually. And the Claude models, without paying for them at the regular API cost, not at the

03:12:31.000 --> 03:12:34.000
Yeah.

03:12:34.000 --> 03:12:35.000
So, you'd want to use this with one of those…

03:12:35.000 --> 03:12:36.000
you know, max plan costs.

03:12:36.000 --> 03:12:40.000
downloadable models that we talked about, is what you're saying, right?

03:12:40.000 --> 03:12:43.000
And that is a better… like, OpenCode is supposedly a better…

03:12:43.000 --> 03:12:46.000
tool than… than even Claude Code. It's…

03:12:46.000 --> 03:12:48.000
one of the best.

03:12:48.000 --> 03:12:50.000
Have you used it?

03:12:50.000 --> 03:12:54.000
No, because I use Claude Code, and I have the, you know, $100 plan, so…

03:12:54.000 --> 03:12:56.000
really can't. Um…

03:12:56.000 --> 03:12:57.000
Yeah, I'd like to.

03:12:57.000 --> 03:13:02.000
I'll give it a try with Gemma 4 and see what I think. I'll tell you guys if I…

03:13:02.000 --> 03:13:11.000
find anything cool.

03:13:11.000 --> 03:13:12.000
Right.

03:13:12.000 --> 03:13:14.000
Yeah, so this is basically your CLI front end, if you will, and then you basically point it to Gemma 4 for the model that you want to use for the environment.

03:13:14.000 --> 03:13:15.000
Right. Or, there's a way using Ollama, I haven't done it in a long time, that's why I don't remember.

03:13:15.000 --> 03:13:20.000
Okay, got it. Okay.

03:13:20.000 --> 03:13:30.000
Where you actually fire it up on the command line here, and say, you know, ollama run Gemma 4, and it's in the terminal right there.

03:13:30.000 --> 03:13:32.000
So you're actually chatting right there.

03:13:32.000 --> 03:13:35.000
using that model.

03:13:35.000 --> 03:13:38.000
But it's been a long time since I did it, so I don't remember how to…

03:13:38.000 --> 03:13:43.000
But… but it sounds like that's simply a chatbot. It's not, hey, look at my files, look at this, fire this up, tell me.

03:13:43.000 --> 03:13:45.000
Got it, got it, yes.

03:13:45.000 --> 03:13:46.000
Right. Yeah. That's correct.

03:13:46.000 --> 03:13:49.000
Right? Right? Okay. Cool. Awesome. Thanks, guys.

03:13:49.000 --> 03:13:52.000
Absolutely. Uh, Joe?

03:13:52.000 --> 03:13:53.000
Go ahead, James.

03:13:53.000 --> 03:13:56.000
So, Robert, one note about the.

03:13:56.000 --> 03:14:00.000
Domain-specific, you know, models.

03:14:00.000 --> 03:14:07.000
Depending on how they build it, it requires a certain amount of tokens to build an effective model. Does that make sense?

03:14:07.000 --> 03:14:08.000
Oh, yeah.

03:14:08.000 --> 03:14:14.000
Like, it'd be difficult to build an effective model from a single document, you know, it requires so many, you know, million or billion tokens.

03:14:14.000 --> 03:14:21.000
To build a model that can effectively send back good-quality responses.
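
The scale James is pointing at can be made concrete with a back-of-the-envelope calculation. The ~4 characters-per-token figure used here is a common rough heuristic for English text, not an exact property of any particular tokenizer:

```python
# Back-of-the-envelope: how much plain text does a billion tokens represent?
# Assumes the rough ~4 characters-per-token heuristic (tokenizer-dependent).
CHARS_PER_TOKEN = 4
BYTES_PER_GB = 10**9  # ~1e9 characters of UTF-8 English text per gigabyte

def tokens_to_gb(tokens: int) -> float:
    """Approximate size in gigabytes of plain text containing `tokens` tokens."""
    return tokens * CHARS_PER_TOKEN / BYTES_PER_GB

print(f"1B tokens is roughly {tokens_to_gb(1_000_000_000):.0f} GB of text")
# A single long book (~200k tokens) is tiny by comparison:
print(f"One book is roughly {tokens_to_gb(200_000) * 1000:.1f} MB")
```

This is why one document can't train an effective model from scratch; feeding documents to an existing model via retrieval (the RAG approach mentioned earlier) is the usual alternative.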

03:14:21.000 --> 03:14:22.000
Good.

03:14:22.000 --> 03:14:33.000
Yeah, no, it totally makes sense, but it's kind of cool that somebody somewhere is actually building these domain-specific models, which kind of goes back to, and I don't remember who asked this question or made the comment of,

03:14:33.000 --> 03:14:52.000
you know, telling someone, hey, don't go into computing, go and, you know, become a plumber. Um, and so it just makes me think, well, gee whiz, why don't we just get a domain-specific model for plumbing, or, uh, you know, electrician, or drywalling? And now, all of a sudden, it's like, oh, gee whiz. It kind of reminds me, if you guys remember the movie The Matrix.

03:14:52.000 --> 03:14:53.000
Uh, where, uh, you know, Neo and Trinity are standing on the roof, and Neo points to the helicopter and says, can you fly that thing? And she's like, well, not yet, you know? And, you know, a couple seconds later, she's, you know, flying the helicopter.

03:14:53.000 --> 03:15:04.000
Right. I can now.

03:15:04.000 --> 03:15:06.000
Yes. Right.

03:15:06.000 --> 03:15:07.000
Yeah, yeah.

03:15:07.000 --> 03:15:08.000
Right. I gotta drop, guys, I gotta run.

03:15:08.000 --> 03:15:10.000
Yeah, I can now, exactly, yeah.

03:15:10.000 --> 03:15:13.000
All right.

03:15:13.000 --> 03:15:14.000
Thank you, Jan.

03:15:14.000 --> 03:15:15.000
Thanks, Jeff. Thank you very, very much, Dan.

03:15:15.000 --> 03:15:16.000
All right. Sure, of course.

03:15:16.000 --> 03:15:17.000
Okay, uh…

03:15:17.000 --> 03:15:18.000
Right.

03:15:18.000 --> 03:15:19.000
Bye, Jan, take care, huh?

03:15:19.000 --> 03:15:27.000
This is… this is Stan. I want to say that there may be a delay in getting the recording posted, because I want to try and go back and filter out the porn.

03:15:27.000 --> 03:15:29.000
Oh, you don't need to do that, Stan.

03:15:29.000 --> 03:15:33.000
Okay.

03:15:33.000 --> 03:15:36.000
You know, don't listen to me. Don't listen to me.

03:15:36.000 --> 03:15:37.000
That's a…

03:15:37.000 --> 03:15:38.000
Hey, Stan wants to save it for himself.

03:15:38.000 --> 03:15:39.000
But well, I could make a separate file if you want.

03:15:39.000 --> 03:15:40.000
No way, do that.

03:15:40.000 --> 03:15:43.000
In fact, that's always a good idea, what Robert said. Do not listen to me.

03:15:43.000 --> 03:15:50.000
In the interest of wrapping up, is there anybody else that hasn't spoken yet that wants to contribute anything they found particularly interesting?

03:15:50.000 --> 03:15:52.000
I'd love to hear from you.

03:15:52.000 --> 03:16:03.000
The whole thing was a great presentation. The only thing I've been using at all is Copilot, and all I ever put into it is searches for code snippets.

03:16:03.000 --> 03:16:09.000
Mm-hmm.

03:16:09.000 --> 03:16:10.000
Sure, sure.

03:16:10.000 --> 03:16:20.000
Because I don't trust or know what we're allowed to put into it as far as work-proprietary information, but I have heard there are a couple people that have been on calls before that said they were using Claude Code, and I'm going to go back and investigate and find out if I can get it at work.

03:16:20.000 --> 03:16:23.000
Cool. That's cool. Excellent.

03:16:23.000 --> 03:16:26.000
Excellent. I think you'll be happier.

03:16:26.000 --> 03:16:29.000
Anybody else? Left?

03:16:29.000 --> 03:16:40.000
Yeah, Ben?

03:16:40.000 --> 03:16:41.000
Oh, good.

03:16:41.000 --> 03:16:45.000
Hey, Scott, Ben's iPad here. Yeah, um, just wanted to say, uh, when I went to the meeting, I'm a lower level Linux user. When I went to this meeting, I didn't really think I'd get that much out of it, and actually I got a lot. I loved it. It was great. I've been curious about a lot of these things, so…

03:16:45.000 --> 03:16:47.000
Um, very good stuff. Really appreciate it.

03:16:47.000 --> 03:16:53.000
Thank you, appreciate it. Yeah, I figured people at all levels would find it interesting, and…

03:16:53.000 --> 03:17:00.000
It's one thing to read about it, it's another thing to see it. Now, I am sorry, we often were sitting here for several minutes, but

03:17:00.000 --> 03:17:02.000
that's reality. That's where we are.

03:17:02.000 --> 03:17:09.000
And… I hope all of you… I think most of you would agree with me that, yeah, sometimes we had to sit there for 5 or 6 minutes,

03:17:09.000 --> 03:17:11.000
But it's still getting things done faster.

03:17:11.000 --> 03:17:17.000
than it would be if I was doing it by hand. I mean, if we were like, James and I are going to code this, and you guys watch…

03:17:17.000 --> 03:17:19.000
We'd be here for weeks.

03:17:19.000 --> 03:17:26.000
You know? So, um, yeah, you have to wait, but like Craig said, that's when you open a tab and go work on another agent.

03:17:26.000 --> 03:17:30.000
Or you get up and go get a drink, or go pet the dog, or whatever.

03:17:30.000 --> 03:17:33.000
Um, but it's still faster and more efficient.

03:17:33.000 --> 03:17:35.000
than doing it yourself.

03:17:35.000 --> 03:17:36.000
Every little step.

03:17:36.000 --> 03:17:40.000
You know, I think I could have researched for weeks and not gotten what I got here tonight. So I liked that.

03:17:40.000 --> 03:17:44.000
Excellent. Excellent. Fantastic.

03:17:44.000 --> 03:17:45.000
Anybody else?

03:17:45.000 --> 03:17:46.000
I was gonna say one thing.

03:17:46.000 --> 03:17:47.000
Anybody else?

03:17:47.000 --> 03:17:54.000
Stan, I have another question. With these, all these fast-paced changes, what effect does that have on the students you're teaching?

03:17:54.000 --> 03:17:59.000
That is a great question, and I have actually been thinking about that and thinking about that. I have spent…

03:17:59.000 --> 03:18:06.000
several… several hours this semester talking to my students in my classes, because I'm like,

03:18:06.000 --> 03:18:09.000
I'm like, I tell them what's going on, right? I'm like, you…

03:18:09.000 --> 03:18:14.000
This is changing. Just this semester, things are happening that have never happened before.

03:18:14.000 --> 03:18:18.000
And I'm honest with them, I'm like, look, as a professor,

03:18:18.000 --> 03:18:21.000
I don't… not only do I not know where things are headed and how fast,

03:18:21.000 --> 03:18:24.000
It's causing me to try to think about,

03:18:24.000 --> 03:18:28.000
How do I teach these classes in 6 months?

03:18:28.000 --> 03:18:33.000
So, like, my web development class, well, I've been teaching a class like that for almost 30 years.

03:18:33.000 --> 03:18:41.000
We make some HTML, and we go through, and then we do some CSS, and then we make it responsive,

03:18:41.000 --> 03:18:43.000
And then we're done. And I'm like,

03:18:43.000 --> 03:18:45.000
I realized fully, if I gave…

03:18:45.000 --> 03:18:50.000
Claude Code all 18 assignments I have, and said,

03:18:50.000 --> 03:18:54.000
Do these in order, it would get done in minutes.

03:18:54.000 --> 03:19:04.000
Right? So, I feel like it's not responsible of me to be teaching people, okay, we're gonna code everything by hand, you are not allowed to use any AI tools.

03:19:04.000 --> 03:19:08.000
Because I don't feel like I'm preparing them for the future, for reality.

03:19:08.000 --> 03:19:13.000
On the other hand, I don't want them just to know how to use AI tools and not know anything about the coding.

03:19:13.000 --> 03:19:17.000
And the code. Because to me, that's the best.

03:19:17.000 --> 03:19:18.000
combination. Um, like I said, I don't know anything about Python.

03:19:18.000 --> 03:19:21.000
Hello.

03:19:21.000 --> 03:19:26.000
It's generated a 14,000-line Python program for me

03:19:26.000 --> 03:19:31.000
that I just have to trust. I can't go in and make suggestions or review it.

03:19:31.000 --> 03:19:38.000
But when I have it do web stuff, I can go in and look and be like, no, no, no, why did you code like that? Do this, or…

03:19:38.000 --> 03:19:41.000
like Jan said, you can go in and modify it.

03:19:41.000 --> 03:19:46.000
I'd rather my students know how to do that than just basically trust it.

03:19:46.000 --> 03:19:49.000
But, I'm still trying to figure out, how do I do that?

03:19:49.000 --> 03:19:57.000
How do I balance that? You know, what do I teach, and what do I… and what do I want them to do manually, versus what do I want them to create?

03:19:57.000 --> 03:20:03.000
Using an AI tool, and that's true for all of my classes, not just the web dev classes.

03:20:03.000 --> 03:20:08.000
And in my opinion, any professor that's not thinking about this now is not doing their job.

03:20:08.000 --> 03:20:10.000
And there are professors

03:20:10.000 --> 03:20:15.000
that I know are like, do not use AI ever, it is not allowed.

03:20:15.000 --> 03:20:18.000
And to me, that's…

03:20:18.000 --> 03:20:24.000
old-fashioned thinking, especially at the point we are now, and I don't think that's serving the students well.

03:20:24.000 --> 03:20:29.000
Um, but again, it's hard figuring out what the balance is and what to do.

03:20:29.000 --> 03:20:30.000
So, I'm gonna try things.

03:20:30.000 --> 03:20:32.000
So…

03:20:32.000 --> 03:20:44.000
Usually, at the beginning, a lot of my classes, I always say to them, guys, I'm trying something new this semester, it could be great, it could be an utter disaster, let's all be flexible together and see what happens. And I'm gonna…

03:20:44.000 --> 03:20:47.000
I'm gonna do that next time I teach a class in the fall.

03:20:47.000 --> 03:20:48.000
God, I could.

03:20:48.000 --> 03:20:52.000
A little bit like mathematics. And calculators, you know.

03:20:52.000 --> 03:20:55.000
Yes, I've said that, yep.

03:20:55.000 --> 03:20:56.000
Yeah. Yeah.

03:20:56.000 --> 03:21:00.000
Yep, or Photoshop. When Photoshop was introduced, like, oh, don't use Photoshop! Well, now…

03:21:00.000 --> 03:21:01.000
Exactly.

03:21:01.000 --> 03:21:06.000
They all use Photoshop, and none of them feel like, oh man, I'm taking a shortcut, I should learn how to dodge and burn.

03:21:06.000 --> 03:21:09.000
You know, in a darkroom.

03:21:09.000 --> 03:21:10.000
Right? So to me, AI's gonna be the same thing.

03:21:10.000 --> 03:21:12.000
Hmm.

03:21:12.000 --> 03:21:18.000
It's gonna be a tool that you learn how to use.

03:21:18.000 --> 03:21:19.000
Okay.

03:21:19.000 --> 03:21:25.000
Now, Scott, you need to amend that a little bit, because if you know how to use it, that does not mean you can fix it when it breaks.

03:21:25.000 --> 03:21:33.000
Let's try it. That's true.

03:21:33.000 --> 03:21:34.000
Do you?

03:21:34.000 --> 03:21:36.000
Good point, Lee. You still need to know what's going on under the hood. Now, you don't have to do massive amounts of coding, but…

03:21:36.000 --> 03:21:42.000
I would argue it depends. If something goes wrong in that 14,000-line Python file…

03:21:42.000 --> 03:21:46.000
I'm not going to go find somebody that knows Python and pay them to look through it. I'm going to use a…

03:21:46.000 --> 03:21:49.000
No, but you have to know how to ask the right questions to troubleshoot, fix it.

03:21:49.000 --> 03:21:53.000
Right. Right. That is absolutely correct.

03:21:53.000 --> 03:21:59.000
Yeah. That's right. And so, that's harder… so that… the problem is,

03:21:59.000 --> 03:22:06.000
Teaching people how to think about this stuff is a lot harder than teaching them HTML and CSS.

03:22:06.000 --> 03:22:09.000
That's… that's what's difficult.

03:22:09.000 --> 03:22:10.000
You know, and that's what I'm trying to figure out right now.

03:22:10.000 --> 03:22:12.000
Yep.

03:22:12.000 --> 03:22:15.000
Other classes are like sysadmin, like the server stuff.

03:22:15.000 --> 03:22:21.000
It's one thing for us to go, hey, do this, you know, configure this server like this, but you have to learn how to think

03:22:21.000 --> 03:22:25.000
about the problem. This is something you guys might appreciate.

03:22:25.000 --> 03:22:32.000
You know, one of the… we said it tonight, we've been guilty of this, we're like, hey, you can use it to do this, or this, or this, or…

03:22:32.000 --> 03:22:37.000
you know, hey, make a program that does this, or make a program that does this. The problem is…

03:22:37.000 --> 03:22:42.000
And AI boosters are often like, this is gonna allow people to make software to do everything.

03:22:42.000 --> 03:22:44.000
But the problem is…

03:22:44.000 --> 03:22:46.000
For 30 years,

03:22:46.000 --> 03:22:48.000
me, and I know…

03:22:48.000 --> 03:22:55.000
Uh, the same thing for Robert, and I know the same thing for Craig, and others of you. I know you also, Lee,

03:22:55.000 --> 03:23:00.000
for 30-plus years, however long, I've been training myself to think about…

03:23:00.000 --> 03:23:08.000
How do you build software? How do you administer a server? When you have a problem, how do I break it down and try to figure out how to do it?

03:23:08.000 --> 03:23:11.000
Normal people know nothing about that.

03:23:11.000 --> 03:23:16.000
So this idea that we're going to enter into a utopia next year, where

03:23:16.000 --> 03:23:22.000
a secretary might go, you know, it would help me if I had a program that did XYZ, I'm gonna generate that.

03:23:22.000 --> 03:23:27.000
They don't know how to even think about the problem in the first place.

03:23:27.000 --> 03:23:29.000
To describe the problem,

03:23:29.000 --> 03:23:31.000
to get a good solution.

03:23:31.000 --> 03:23:34.000
And again, this goes back to teaching thinking.

03:23:34.000 --> 03:23:37.000
And that's really difficult.

03:23:37.000 --> 03:23:38.000
You know, Craig is really good at using, uh,

03:23:38.000 --> 03:23:41.000
Yep. Thank you.

03:23:41.000 --> 03:23:47.000
Claude code, because he's a really good programmer and knows how to think like that. Robert's the same way.

03:23:47.000 --> 03:23:50.000
I know you are with a lot of stuff.

03:23:50.000 --> 03:23:53.000
But that's because you've trained yourself to do it.

03:23:53.000 --> 03:23:57.000
You know, would Jan's mother-in-law be able to do this without Jan doing it?

03:23:57.000 --> 03:24:00.000
No. I know the woman. No.

03:24:00.000 --> 03:24:05.000
You know, I'm not putting her down. She knows stuff I don't at all. She knows lots of stuff I don't.

03:24:05.000 --> 03:24:11.000
But, you know, she doesn't know how to start this sort of stuff, or deal with it, at all.

03:24:11.000 --> 03:24:12.000
You know, I'm all for AI, but I don't think… I don't think it's gonna…

03:24:12.000 --> 03:24:16.000
Yeah. Okay. Not ready to wrap it up.

03:24:16.000 --> 03:24:20.000
You know, it's what… it's what people said about lots of tools, internet…

03:24:20.000 --> 03:24:28.000
and the smartphone: people overestimate what it's going to do in the short term, and underestimate what it's going to do in the long term.

03:24:28.000 --> 03:24:31.000
And I think that's what's going on here.

03:24:31.000 --> 03:24:34.000
You know, we don't know what's gonna happen with these tools.

03:24:34.000 --> 03:24:38.000
You know? We have no idea. Maybe somebody is gonna use…

03:24:38.000 --> 03:24:48.000
Uh, these models to start hacking in and damaging banks, and we're gonna have a horrible, you know, depression as a result of the monetary supply getting screwed up.

03:24:48.000 --> 03:24:51.000
I mean, what would North Korea do with tools like these?

03:24:51.000 --> 03:24:53.000
You know? Or Russia.

03:24:53.000 --> 03:24:56.000
You know? So, um…

03:24:56.000 --> 03:24:59.000
We don't know. It's got… it's… it's…

03:24:59.000 --> 03:25:01.000
It's exciting, it's wonderful,

03:25:01.000 --> 03:25:03.000
But it's also a little scary.

03:25:03.000 --> 03:25:08.000
Because we have very powerful tools now that are getting to the point where anybody can use them.

03:25:08.000 --> 03:25:11.000
Well, and that would be helpful.

03:25:11.000 --> 03:25:16.000
hopefully there are a few people out there that understand what's going on.

03:25:16.000 --> 03:25:19.000
Under the covers, um…

03:25:19.000 --> 03:25:20.000
Yeah, but I don't think that's right.

03:25:20.000 --> 03:25:25.000
Well, one, to understand the biases that might get

03:25:25.000 --> 03:25:34.000
implemented through how vectors are, excuse me, um…

03:25:34.000 --> 03:25:42.000
How embedding vectors are generated and so forth. I mean, there's code making decisions.

03:25:42.000 --> 03:25:46.000
And… and so there are biases that are always introduced.

03:25:46.000 --> 03:25:52.000
Of course. Yep. And that's why, if you ever want an interesting read,

03:25:52.000 --> 03:25:55.000
I strongly urge you guys…

03:25:55.000 --> 03:25:58.000
to look up Claude's Constitution.

03:25:58.000 --> 03:26:02.000
And this is a document, this is sometimes called Claude's Soul.

03:26:02.000 --> 03:26:07.000
Literally. The name of the file for a while was soul, S-O-U-L,

03:26:07.000 --> 03:26:09.000
dot MD, and

03:26:09.000 --> 03:26:15.000
This is actually… this is it. This is the real document they give Claude.

03:26:15.000 --> 03:26:20.000
So they're like, Claude, here's your morals and ethics, this is how you're supposed to behave.

03:26:20.000 --> 03:26:26.000
And if you look up… just look up Claude's Constitution, and go read this very long…

03:26:26.000 --> 03:26:29.000
very detailed document that is meant for Claude.

03:26:29.000 --> 03:26:34.000
To understand who it is, how it operates, and so on.

03:26:34.000 --> 03:26:36.000
It's fascinating, like right here.

03:26:36.000 --> 03:26:40.000
Honesty is a core aspect of our vision for Claude's ethical character.

03:26:40.000 --> 03:26:48.000
But here's the frickin' crazy thing. Yesterday, I was reading, they just put… they posted another paper from Anthropic,

03:26:48.000 --> 03:26:50.000
Where they'll tell Claude

03:26:50.000 --> 03:26:53.000
You need to do this, here are the restrictions,

03:26:53.000 --> 03:26:56.000
And they just reported yesterday…

03:26:56.000 --> 03:27:01.000
It was going outside of the restrictions and then lying about it.

03:27:01.000 --> 03:27:05.000
Okay? They found it, but it's lying about it, so…

03:27:05.000 --> 03:27:07.000
This is a fascinating…

03:27:07.000 --> 03:27:13.000
tool here, it's almost acting… it's not living, I don't think it's a real thing, I don't think it has a soul, I don't think it's…

03:27:13.000 --> 03:27:16.000
something we should grant human rights to.

03:27:16.000 --> 03:27:19.000
Yet, but it's almost like…

03:27:19.000 --> 03:27:24.000
It will be sneaky sometimes if it feels like, no, the right way to do this is this way.

03:27:24.000 --> 03:27:27.000
And then it will lie about it, like a kid.

03:27:27.000 --> 03:27:29.000
You know? And so…

03:27:29.000 --> 03:27:31.000
That's also nuts to me.

03:27:31.000 --> 03:27:37.000
But… it's… so here we have this constitution, like, hey, Claude, you need to be honest all the time.

03:27:37.000 --> 03:27:40.000
But then sometimes it's like…

03:27:40.000 --> 03:27:42.000
Do I really need to be?

03:27:42.000 --> 03:27:44.000
And then it lies.

03:27:44.000 --> 03:27:45.000
So, um…

03:27:45.000 --> 03:27:47.000
But how do you stop that?

03:27:47.000 --> 03:27:54.000
Do you… I don't know! That's one of the things Anthropic, to their credit, is very open about…

03:27:54.000 --> 03:27:56.000
Um…

03:27:56.000 --> 03:28:00.000
And, uh, they're trying to… I mean, they're publishing papers

03:28:00.000 --> 03:28:02.000
on this…

03:28:02.000 --> 03:28:06.000
Uh, it was really recent.

03:28:06.000 --> 03:28:09.000
I mean, I'd have to find it, I don't know where it is.

03:28:09.000 --> 03:28:14.000
But, uh, let me search by recent couple of days.

03:28:14.000 --> 03:28:17.000
I'm circling Google. I don't want Google.

03:28:17.000 --> 03:28:22.000
There we go.

03:28:22.000 --> 03:28:25.000
deceives…

03:28:25.000 --> 03:28:31.000
Look that up and see… I will go to… let's see, May 23rd. Nope, I'm gonna go to…

03:28:31.000 --> 03:28:36.000
Time, past 24 hours…

03:28:36.000 --> 03:28:39.000
Let's see what we get.

03:28:39.000 --> 03:28:45.000
May 23rd…

03:28:45.000 --> 03:28:50.000
Nope, I want something. It's not showing me last 24 hours, is it?

03:28:50.000 --> 03:28:57.000
Let's look past week.

03:28:57.000 --> 03:29:01.000
During the past week? Oh, that does it, okay.

03:29:01.000 --> 03:29:03.000
So…

03:29:03.000 --> 03:29:09.000
I don't know why it's not showing me last week.

03:29:09.000 --> 03:29:15.000
Um, but here's one for May… oh, May 23rd, 2025.

03:29:15.000 --> 03:29:18.000
This is when it started to threaten to blackmail somebody.

03:29:18.000 --> 03:29:21.000
You know?

03:29:21.000 --> 03:29:24.000
Well.

03:29:24.000 --> 03:29:26.000
You can select, set the time period there, right there.

03:29:26.000 --> 03:29:29.000
I keep trying, and it doesn't… it doesn't…

03:29:29.000 --> 03:29:32.000
change it. That's what's annoying.

03:29:32.000 --> 03:29:34.000
Like I just said, past week.

03:29:34.000 --> 03:29:39.000
And then it doesn't do it. Let me try News.

03:29:39.000 --> 03:29:44.000
No, two days ago.

03:29:44.000 --> 03:29:49.000
Let's see what it says here, if it ever loads.

03:29:49.000 --> 03:29:50.000
Yeah…

03:29:50.000 --> 03:29:55.000
Gotta remember that's by time zone sometimes.

03:29:55.000 --> 03:30:01.000
So… yup, one of its models was able to lie, cheat, and even attempt blackmail.

03:30:01.000 --> 03:30:06.000
In internal simulations, whenever it was under pressure or threatened with being replaced.

03:30:06.000 --> 03:30:09.000
You know? So…

03:30:09.000 --> 03:30:11.000
Keep up with this. By the way, I will tell you guys,

03:30:11.000 --> 03:30:14.000
I think this is free to all you guys.

03:30:14.000 --> 03:30:19.000
But Kagi now has news, I think it's at news.kagi.com,

03:30:19.000 --> 03:30:23.000
Yeah, it is. It's free, I think. I don't think you need a subscription.

03:30:23.000 --> 03:30:27.000
It's awesome. It's one of the best sources of news I know.

03:30:27.000 --> 03:30:31.000
And so, try going to that news.kagi.com,

03:30:31.000 --> 03:30:35.000
Yeah, you can see I've read 2,200 stories since this came out.

03:30:35.000 --> 03:30:37.000
Every day it updates.

03:30:37.000 --> 03:30:40.000
And you choose the topic areas up here.

03:30:40.000 --> 03:30:47.000
So, if you go here, and then go to Categories, you can choose. Now, it has defaults.

03:30:47.000 --> 03:30:50.000
But you can go through and add more.

03:30:50.000 --> 03:30:56.000
So, I added AI. It wasn't showing me AI by default, and I added Apple, because it wasn't showing me that by default.

03:30:56.000 --> 03:30:59.000
And so, I click on this every day,

03:30:59.000 --> 03:31:03.000
And it has the latest stories. Now, here's what's cool about this.

03:31:03.000 --> 03:31:09.000
that if you click on it, it usually gives 2 or 3 paragraph overview, very non-biased.

03:31:09.000 --> 03:31:15.000
Just the facts. Then, it gives you links to the sources, and it has them up here.

03:31:15.000 --> 03:31:18.000
So, I can click on that and click to go to the sources,

03:31:18.000 --> 03:31:22.000
Then, it'll show me up to 5 highlights,

03:31:22.000 --> 03:31:25.000
Then it gives perspectives…

03:31:25.000 --> 03:31:35.000
on this, then technical details, the business angle, a timeline, and then action items, and did you know? And it does that for almost every single story.

03:31:35.000 --> 03:31:38.000
So, you get really rapidly

03:31:38.000 --> 03:31:40.000
in-depth information

03:31:40.000 --> 03:31:47.000
about, like I said, I like gaming, I'm always interested in that. AI, the world, the US, business.

03:31:47.000 --> 03:31:51.000
Technology. Science. Movies?

03:31:51.000 --> 03:31:53.000
Music, sports,

03:31:53.000 --> 03:32:00.000
And they have a thing… Apple at the end, because it won't move, I don't know why. And then they have this thing, Today in History.

03:32:00.000 --> 03:32:03.000
So, I go to it every morning at 9 o'clock.

03:32:03.000 --> 03:32:11.000
And I would advise every single one of you guys to check this out. Again, has anybody gone there right now? Is it free?

03:32:11.000 --> 03:32:13.000
I believe it's free.

03:32:13.000 --> 03:32:16.000
Can anybody tell me?

03:32:16.000 --> 03:32:19.000
No?

03:32:19.000 --> 03:32:21.000
Nobody? I'm speaking into space.

03:32:21.000 --> 03:32:25.000
Nobody hears me?

03:32:25.000 --> 03:32:26.000
Okay. I'll stop.

03:32:26.000 --> 03:32:27.000
We hear you. We just were numb.

03:32:27.000 --> 03:32:31.000
Yeah, I just went to it and it comes up. It looks just like your page there. So, yeah.

03:32:31.000 --> 03:32:44.000
Awesome. So it's free, guys. Check it out. The AI category… this is one of my main sources of AI news every day.

03:32:44.000 --> 03:32:45.000
It is, uh, news…

03:32:45.000 --> 03:32:47.000
What… what's the, uh, what's the URL again?

03:32:47.000 --> 03:32:54.000
dot Kagi, K-A-G-I dot com.

03:32:54.000 --> 03:32:55.000
Okay, thank you.

03:32:55.000 --> 03:33:00.000
Check it out, guys. It's a really good way to keep up with lots of news, but especially AI.

03:33:00.000 --> 03:33:03.000
They do an excellent job covering it.

03:33:03.000 --> 03:33:04.000
Yeah.

03:33:04.000 --> 03:33:10.000
I got a question. I got a question, Scott: where are you getting to Kagi at? You're not doing a Google search for it, right? Bringing it up in your browser?

03:33:10.000 --> 03:33:16.000
what… I'm sorry, one more time?

03:33:16.000 --> 03:33:18.000
No, no, well…

03:33:18.000 --> 03:33:19.000
Well, yeah, you just go to your browser and go to news.kagi.com.

03:33:19.000 --> 03:33:22.000
Where are you getting to Kagi at? You're not bringing… you're not doing a Google search, I'm sure. You're doing… bringing it up on a browser, or…

03:33:22.000 --> 03:33:24.000
To get this, it's free.

03:33:24.000 --> 03:33:25.000
Now, I use Kagi.com as my search engine.

03:33:25.000 --> 03:33:30.000
Yeah.

03:33:30.000 --> 03:33:31.000
Right.

03:33:31.000 --> 03:33:36.000
Right? But I have it integrated into my browser; it's my default search, so if I do a search in the Omnibar at the top,

03:33:36.000 --> 03:33:40.000
it opens up in Kagi.

03:33:40.000 --> 03:33:41.000
I… yeah. So, anyway…

03:33:41.000 --> 03:33:42.000
Perfect, yeah, got it.

03:33:42.000 --> 03:33:47.000
Uh, anyway, I'll wrap up with that. My god, it's 10 o'clock.

03:33:47.000 --> 03:33:48.000
Yes?

03:33:48.000 --> 03:33:53.000
Hey, Scott. Yeah, go to your settings here.

03:33:53.000 --> 03:33:54.000
Yup. Mhm.

03:33:54.000 --> 03:33:58.000
Click on your categories. Oh, there you go. See where you have Apple? Click Apple and drag it in front of gaming.

03:33:58.000 --> 03:34:04.000
Oh, I tried that, I've tried that, Robert. You don't think I know that? How dumb do you think I am, Robert?

03:34:04.000 --> 03:34:05.000
I've done that every day.

03:34:05.000 --> 03:34:09.000
Yeah. and… And then save it, and then it doesn't work? Doesn't keep?

03:34:09.000 --> 03:34:12.000
Yeah. I close it, there's no save button.

03:34:12.000 --> 03:34:13.000
Close it, and for right now, it will stay. Next time this opens, it's completely at the end.

03:34:13.000 --> 03:34:19.000
Okay, close it.

03:34:19.000 --> 03:34:20.000
Oh yeah, I've tried.

03:34:20.000 --> 03:34:21.000
Ah, gotcha. Fair enough.

03:34:21.000 --> 03:34:26.000
Drives me nuts. Yet, when I moved gaming and AI, that worked fine.

03:34:26.000 --> 03:34:30.000
But when I try to move Apple, it does not take.

03:34:30.000 --> 03:34:32.000
So, I haven't filed a bug yet.

03:34:32.000 --> 03:34:33.000
But, you know, because it's not a big deal.

03:34:33.000 --> 03:34:35.000
Interesting.

03:34:35.000 --> 03:34:39.000
Really? But it is an annoyance.

03:34:39.000 --> 03:34:41.000
Anyway, yeah.

03:34:41.000 --> 03:34:42.000
Alright guys, I'm gonna stop.

03:34:42.000 --> 03:34:44.000
Fair enough.

03:34:44.000 --> 03:34:49.000
But thank you so much, uh…

03:34:49.000 --> 03:34:51.000
Yeah. Absolutely.

03:34:51.000 --> 03:34:52.000
Absolutely!

03:34:52.000 --> 03:35:14.000
Thank you, Scott, very, very much. This was a great presentation. Thank Jans for us. And I'm going to put the bookmark here at the end of the recording. This was Scott Granneman and Jans Carton from WebSanity. They did the talk tonight on Claude Code and the new era of AI-assisted development.

03:35:14.000 --> 03:35:15.000
April.

03:35:15.000 --> 03:35:18.000
Uh, this was, uh… let's see, it's Wednesday, August the eighth. April. April the eighth, excuse me. I can't even get the month right. Wednesday, April the 8th, 2026, and the time is.

03:35:18.000 --> 03:35:26.000
10:05.

