
#41 - Timothy Baldridge aka @halgari

defn
An epic tour de force with Timothy. This is our longest episode yet so you may have to hit the pause button once or twice! He has many interesting stories and opinions so we just couldn't stop talking :) [ On this show at least he is not as stern as the photo suggests ] Links Twitter - https://twitter.com/timbaldridge Github - https://github.com/halgari Youtube - https://www.youtube.com/channel/UCLxWPHbkxjR-G-y6CVoEHOw PivotShare tutorials - https://tbaldridge.pivotshare.com/home
Transcript

Episode Naming Debate

00:00:15
Speaker
Welcome to defn, episode number 41, and 42-ish, because we were just discussing if we want to make it 42 already, because we've been trying to get the answer to Life, the Universe and Everything with Mr. Timothy Baldridge.
00:00:33
Speaker
Hello. So welcome to the episode. Thank you. Thank you. I don't know if they'll be able to find the answers to life or anything, but it's great to be here. We're looking for anything. Come on. Yes. Something.

Acknowledging Supporters

00:00:48
Speaker
So before we start, I think we wanted to say thank you to the people who are supporting us, especially on Patreon. Ray, do you have a couple of members that you want to say thank you to? Yeah, I mean, actually, it's quite phenomenal. We've got over 20 members now. And so it's pretty amazing, you know, the response from the community to
00:01:15
Speaker
to this request, and people are being very generous. I want to thank some of the longest-standing supporters. What we'll do is we'll try and make a few name calls every episode, because we can't go through every one of them now. I want to make a call out to the guys that have been doing it for the longest. That's Claudio, Claudio, Claudio, Claudio.
00:01:39
Speaker
You know who you are. And to Sebastian and to Arturas, to Norman and to Reuben. These guys have supported us for the longest time and their generosity is amazing. So yeah, thank you very much. Yeah, thanks a lot. We'll definitely call out a few more people on the next episodes. But this is great. It helps us to cover the costs and
00:02:06
Speaker
You know, maybe we'll have a look at some... we've already sent Wouter to a conference in Berlin and now he's a Clojure programmer. So, you know, the love is being spread, which is good. So we are paying people to get into Clojure. Someone's got to do this.
00:02:28
Speaker
Anyway, yeah, no, it's really great. Thanks, everyone. Really good. Thank you. Thanks a lot. Yes. Thanks to everyone else who's doing it as well. And we definitely will get to everyone on the list. Exactly. So let's get on into the episode now.

Timothy's Work at Cisco

00:02:41
Speaker
So Tim, first, can you give us some idea about what you're working on? Of course, there are lots of Clojure libraries that you built over the years, and you are a full-time Clojure programmer as well. Yes, working in Clojure.
00:02:54
Speaker
Yeah, so I currently work for Cisco here in the US, on their Threat Grid team, where we do analysis of malware in some fairly innovative ways. And I work on a team that indexes
00:03:13
Speaker
the data that we extract out of viruses and the like, and indexes it and presents it in a way that can easily be searched, that sort of thing. So that's my day job. I tend to do a fair amount of database programming, logic databases, that sort of stuff there as well, query engines and that sort of thing. For about half a year now, that's been what I've spent a lot of my time on.
00:03:40
Speaker
I moved on from Cognitect, I guess it'd be about eight, ten months ago. I kind of just took a break from open source at that point. It's like, you know, I've been doing open source on the side for a while, and once in a while it's just nice to sit down and just do work and keep it at work, you know? But I always have hobby projects and stuff kicking around, programming languages and the like, that I like to hack on.
00:04:07
Speaker
But what was your programming journey to get to this point with the languages?

Tim's Programming Journey

00:04:13
Speaker
Hold on. You must have started off with some crap, so let's get that out of the way. Yeah, so I started programming back when I was 10 and a half, when I found a book in the library on how to program BASIC. And this was BASIC with the line numbers, right? And the funny thing was,
00:04:36
Speaker
At that time, which would have been early 90s, I guess.
00:04:41
Speaker
like those versions of BASIC didn't exist anymore. It was like Z80 BASIC and, I forget, Acorn and some of this other stuff, you know. And here I am in DOS with GW-BASIC, right? So I was hand-translating out of these books, and, you know, that thing doesn't exist in this version of BASIC. But not too long after that... And what's actually cool is my dad encouraged me a lot in that sort of thing too. So growing up, my dad worked at
00:05:10
Speaker
a university where computers were kind of like scientific equipment and stuff. So, you know, we were one of those families that always had a computer in the home, but not because, you know, I was a millennial. It was just because, like, my dad built a Z80 from scratch, so let's, you know, now have that in the house. But, so yeah, from there I moved on
00:05:32
Speaker
to QBasic and learned, you know, functions. What are those? Why do I need functions when I have GOTO? And from there, you know, C, Python, a little bit of C++ in there, and then...

Erlang and Functional Programming

00:05:49
Speaker
Really though, I think, so that was all the way up through high school and I started doing IT work and kind of gotten to a point where it was like, do I want to do IT support or programming? And I found myself enjoying programming more and more and just kind of focusing on that. So I think most of my work at that time was in Python, just small scripts and stuff, Visual Basic. And
00:06:14
Speaker
And from there, I worked with Microsoft for a while. And that was where I learned C#, which is kind of, you know, hey, you've got to learn C# or Java at some point. And so that kind of got me into mainstream programming. But around that time is when I discovered Erlang. So Erlang was my first functional programming language. And, you know, what I found fascinating about that is...
00:06:40
Speaker
And I encourage people to do this with any language, like Clojure or whatever: I remember reading about Erlang. Hey, this is a language where you don't share memory, there's no such thing as an if statement, and we don't have loops. And you're like, how do you do anything, you know?
00:06:56
Speaker
And so you can do two things, right? One is you can say, wow, this language must be crap. Or you could say, well, hey, obviously a lot of people use this. What are they doing, you know, differently? And so you learn about pattern matching and recursion and, you know, tail calls and that sort of thing and realize, hey, it works.
00:07:14
Speaker
But after a while working in Erlang, I got to the point where the actor model was cool, but I found myself fighting it more than anything else. I wanted to treat all my memory in a shared way. Immutable still, but in a shared way.

CSP vs Actor Models

00:07:34
Speaker
And so.
00:07:34
Speaker
I got asking, what languages are out there that have cool concurrency models? And Erlang was one. And then people started mentioning, hey, there's this new language called Clojure. This is 2009 or so. And so I got into that. And then, of course, articles I read around the time were like, everyone should learn a Lisp because it'll alter your brain. So that's my journey, all the way
00:08:05
Speaker
from BASIC on. Yeah, so you're saying, did it alter your brain? Yeah, it broke it, I think. That is one type of alteration, yes. No, no, I think it does. I think functional programming in general does that, and then Lisps, with just
00:08:29
Speaker
the simplicity of Lisps, of being able to see how eval and apply work to build this interpreter. I mean, I think that's something that is kind of a point of enlightenment: if you can sit down and just kind of write a Lisp in Clojure sometime. It's a hundred lines of code. And you kind of realize, wow, all this stuff is
00:08:55
Speaker
so simple, and yet powerful. Of course, the devil's in the details there, because it's also slow. So to make it faster, you have to add performance improvements, which add more code and make it more complex. But it's cool to see. What about the homoiconicity? We're allowed to use big words here, so fuck it.
00:09:22
Speaker
So that's the thing that really always strikes me: the REPL and the homoiconicity. And okay, we haven't got so much reader macros, but we've still got macros in Clojure. And those things are quite different. I mean, I was more of a C/Java programmer. So, of course, I'd heard of macros, but nothing like this, you know? Yeah, absolutely. So I was wondering how those affected you, or how those made you think?

Development of Core Async

00:09:52
Speaker
You know, it's fascinating, because I had done a little bit of C++ before that, and there's a lot of template metaprogramming in that. Yeah. And even in high school... so I was homeschooled, and I somehow convinced my parents in high school to let me write a programming language as a school project. And so I've always kind of dabbled in compilers and stuff. And it's a horrible piece of code. I don't even think I have it anymore, but...
00:10:13
Speaker
Yeah, I think that's something that...
00:10:22
Speaker
Even back then, I was kind of thinking about, hey, I hate writing this code over and over again. What if I could have a way to tell the compiler to duplicate this, or do this, but add one to this number every time? And so if you write, like you said, in Java or these other languages, you end up writing the same code over and over, and you start thinking, there's got to be a way I can generate this. And so in Java, you do it by generating
00:10:50
Speaker
a source file. You can always write text out and then load that back up in the compiler, and that's crazy.
00:11:03
Speaker
With Clojure, when you started with Clojure, what kind of applications were you building? Was it consulting for different types of projects? No, actually. Back in that time, I was working on a C# app for an architectural firm in Wisconsin. We were in the HR department, and we were writing project management software.
00:11:24
Speaker
And it was very much a thing. I listened to Rich Hickey's talk, you know, "Are We There Yet?". And I remember listening to this at my desk while programming and just going down the list of everything he says is wrong with software, and I'm actually typing it at that moment. We had all sorts of problems with, you know, mutability and all this sort of stuff. And I just slowly
00:11:51
Speaker
got more and more dissatisfied with that and decided to learn Clojure. Back in that time, I started the clojure-py project, and sometime we can talk about how hard that is to actually be a thing that would work. But that kind of got me learning Clojure. It forces you to read all of the source code if you're going to re-implement a language on a new platform.
00:12:19
Speaker
And so I did a lot of reading of Clojure's source, and just kind of used that on the side to build up my Clojure skills. And I went right from that company to Cognitect. So Cognitect was really the first job I had in Clojure itself. Okay. So the Python thing that you're talking about is Pixie?
00:12:43
Speaker
No, no. So before Pixie, back then, I started a project called clojure-py, and I don't even know that it's still out there, but the idea was basically to compile Clojure to Python bytecode. And it's a little weird, because there are a lot of semantics between Clojure and Python that don't quite match up, things that are close, but not quite there. And really, Clojure
00:13:06
Speaker
really wants a nice JIT. And, you know, PyPy exists for Python, but you have this weird world in Python where you either use CPython or PyPy. And it's like, you know, so are you going to write half of clojure-py in C? Or are you going to write it in Python, and then it only runs fast on PyPy, you know?

Switch from Emacs to Cursive

00:13:25
Speaker
And I made a ton of mistakes back then, too, on that.
00:13:28
Speaker
Did you check... what was the language? It was Hy, H-Y. Yeah. Hy is a newer language. Yeah. That actually still exists, and they do that now. What they did was they basically wrote Lisp-flavored Python. So all the lists were mutable. Python has this horrific view of closures where, if you close over a variable, you can modify it in the closure, and it modifies it wherever the environment was that you took it from, which is like...
00:13:58
Speaker
That's terrible. So there are some things like that that Hy inherits. And if I were ever to go back and do clojure-py, that's the type of thing that you would want to somehow paper over, like ClojureScript does. There are a lot of those sorts of weird semantics in JavaScript, and they're kind of papered over in ClojureScript.
00:14:22
Speaker
Okay. So I think first we need to talk about the biggest issue that we have: Emacs, or some other shit. So yeah. I realized, coming to this podcast, you know, I used to program in Emacs and I now use Cursive, and I was also a raging vegetarian and I'm not anymore, or vegan technically. So, you know, it's been great, guys. I guess we'll just end the podcast here. Right.
00:14:50
Speaker
No, I mean, with Emacs, I think we have... I'm a vocal majority, and Ray, yeah, I mean, he just needs to see the light at some point. Yeah, well, we're getting your nemesis on this program soon, so, you know, it's like...
00:15:09
Speaker
It's like 42 episodes, and I'm still trying to brainwash him. At some point, you know, he's a bit slow, but he'll catch up to Emacs eventually. Yeah, you know, I used to use Emacs, and I started sitting down the other day... it's like, I have co-workers that just sit, I mean, they have Emacs just perfect, everything's configured. And I am that way about certain things in my life.
00:15:30
Speaker
You know, behind me is a painting from Skyrim, and I will sit down and I will mod Skyrim. It's some guy fighting a dragon. I will mod Skyrim for tens of hours and, you know, install hundreds of mods and get it just the way I want. And I just don't care about Emacs like that. So it makes sense. I mean, you need to have only... you know, you have got dark mode in Cursive, so you're right. Right.
00:15:57
Speaker
I mean, these days I think I'm even more extreme, because I stopped using iTerm or something with the tabs and all that shit. So I switched to Alacritty or something. That's a GPU-accelerated terminal emulator written in Rust. So I have tmux, and then Emacs in the terminal within the tmux, and that's it. Only one huge thing.
00:16:24
Speaker
It's super fast and it's really nice. Anyway, I think I'm going back to the 70s, how people used to program in the 70s, but using the GPU. You always wear flares when you're doing that? Yeah, pretty much. Anyway, yes, exactly. Okay.
00:16:51
Speaker
Yes. Let's talk about Clojure. Yeah. So I think maybe a couple of still-positive things before we get on to the... Wow. Let's keep it positive all the time. Come on. Yes, yes, of course. Before we get on to the, I don't know, serious stuff, or the friction that is in the community.
00:17:14
Speaker
So tell us a bit more about Core Async, like what was your contribution and how did you start with this thing?

Design Choices in Core Async

00:17:24
Speaker
Yeah, yeah. So Core Async... you know, I'm trying to remember what the initial kickoff thing for that was, but early 2013, so this was a few months after I started with Cognitect, I believe, come to think of it.
00:17:42
Speaker
Basically, most languages at that point had a form of async/await, or were getting one. Python, C#... JavaScript has it now, they didn't at the time. And this sort of stuff was starting to exist in Java as well. I don't know what the story is there yet, I don't know if they have those keywords. At any rate, a lot of languages were getting this.
00:18:06
Speaker
And soon Python as well. Yeah. Yeah. And so Rich said, you know, hey, let's figure out how we want to do this in Clojure. And at the time it was going to be all promise-based. We needed some sort of transformation macro, like go. And so Rich said, hey, you know, you enjoy compiler stuff. Would you like to write this? And so I got working on that.
00:18:30
Speaker
And then Rich went to a conference, and I forget what the conference was, and there was a talk there by the Go people, and they talked about Go channels and all this. And he came back and he's like, well, I hope this isn't going to cause you to rewrite a ton of code, but we're going to use channels, because this is way better. And it is, you know? And so, yeah, there was actually a period of time where the go macro worked basically off promises, and that sort of thing, and there was really not much change to the macro based on that.
00:19:00
Speaker
But yeah, so primarily that library was a collaboration between a bunch of people at Cognitect. So myself and Ghadi...
00:19:13
Speaker
You know what, I don't say his last name often enough to not butcher it. Shayban? Is that it? Yeah. Sorry, Ghadi. But he and I collaborated a lot on the go macro. Rich wrote most of the channel stuff. Alex Miller did a lot of the supporting code in there, too. And then other people at Cognitect pitched in for the different
00:19:41
Speaker
parts of Core Async. There's a weird reordering, time-based queue thing in there that somebody wrote, I forget who it was now. But yeah, it was kind of a collaborative piece of work.
00:19:53
Speaker
But how do you contrast this with... because you had experience with other concurrency models, like actors, for example, in Erlang. So how do you explain it, like, I don't know, like I'm five? Well, I'm five in this domain, anyway. So if I can get it, then that's easy. Yeah, so the concept involved with Core Async is called CSP, Communicating Sequential
00:20:20
Speaker
Processes. And I'd been doing a lot of work with continuation-passing style, which is CPS, so I had to think about the acronym. It's a completely different thing. Yeah. So CSP really talks about the idea that you have processes. You can think of them as
00:20:35
Speaker
a process in an operating system, and then you have channels on which they talk to each other. And there's even a form of calculus involved in that, and it's the pi calculus. Yeah, pi calculus, yes. Because, as it turns out, the most basic form of a channel is a handoff.
00:20:51
Speaker
If someone is trying to put onto the channel, the put does not succeed until someone takes the value. And so there's a certain amount of reasoning that can be done there about the correctness of programs and some other things like that, because you have that handoff. It's not as asynchronous, there are no buffers involved, and so it's a lot easier to reason about. But yeah, that contrasts with the actor model in some interesting ways. So in the actor model,
00:21:20
Speaker
the first-class thing in an actor model really is the actor itself. So in actor models, you can do three things. You can start an actor, an actor can send a message to another actor, and an actor can say how the next message it receives should be handled. And that's the core of what the real actor systems are. Erlang adds other things onto that and whatever.
00:21:46
Speaker
And then CSP says something different, and that is: you have processes and you have channels. And so all you do is put something on a channel and hope somebody on the other end takes it. So the first-class thing in that sense is the channel, which works fairly well with the way we build a lot of our systems today. I've built systems where, you know, you scale that up, where your processes are a machine,
00:22:11
Speaker
And then you have a distributed queue, Kafka, you know, RabbitMQ, whatever, and you just put something on the queue and you don't care what happens to it. Somebody will pick it up and do something.
00:22:21
Speaker
But there are some differences there. And I think that's something that we haven't seen explored in the Clojure space. There have been attempts to do Erlang's OTP. So Erlang has this idea of OTP, which is this whole framework for how to restart processes, how to do these sorts of things. That's harder in CSP. And it's outright impossible in Clojure's Core Async, because
00:22:45
Speaker
you can't even talk about processes in Core Async. Processes are not a first-class thing. They're not reified. You can't say "is this thing running?", because you have no way of saying "thing". And so I think there's work to be done in that area, and I'm not sure what that looks like.
00:23:08
Speaker
But what was the reasoning behind picking CSP-based concurrency versus, I don't know, bringing actors into Clojure? Is it because some other libraries are already there, like Akka? Yeah. So there's a couple of things there. One of them is actors in and of themselves. So what I said earlier is...
00:23:28
Speaker
It's interesting. Those three primitives are all that's really required for an actor-based system. And one of them is: send a message to another mailbox, that is, to another actor. It's really a mailbox. You just send a message to an address, like sending a message in the mail. You don't know if it got there. There are no guarantees it got there. It's asynchronous. You can just send a fire hose of mail to somebody, and who knows what's going to happen.
00:23:52
Speaker
And so, you know, I think it was Rich I heard say this first: what is the first thing people do with actor systems? They build a queue. And it's kind of true. It's like, what is the first thing you want in an actor system? Well, you want acknowledgement. Like, I sent the message and I wait for a reply. But what if I sent it, and they got it, but I never get the reply? So you have to keep resending the message until you get a reply from them saying, yeah, I got that.
00:24:19
Speaker
And they have to keep sending that back until you get it. And so you end up building a certain amount of communication on top of that. And when you're all done, that's a channel, in a way, right? And so the actor model works great in distributed systems, where, as it turns out, the network is unreliable. You send a packet to a machine, you don't know if it got there.
00:24:43
Speaker
If they acknowledge it and send an ack, you don't know when the ack is going to get there, you know. But on a local box, it's interesting, because you can build CSP on a machine itself, because you have things like locks. You can...
00:25:00
Speaker
Kind of a transactional type system is possible to build on a machine. So in a lot of ways, Clojure has this philosophy of, you know, let's not start with the worst-case scenario, which is a distributed environment. We're going to start with what we can do on the local box: immutable data, shared access, agents, atoms. Those sorts of things you can't build in a distributed system, but that doesn't limit us from using them in a single system.
00:25:31
Speaker
Yeah. But in that case, isn't the difficulty to scale it up to distributed systems? It is. Then you need to bring different kinds of semantics into the system. It is. Yeah. And I think that's something where, I mean, Core Async does help you with the mindset: you're used to building software with queues, and then we're going to go and use a distributed queue. So you can kind of swap out that channel for a queue. But there are semantic differences.
00:26:00
Speaker
Like a lot of things in Clojure, Clojure gives you a tool, Core Async gives you a toolbox, right? And so, you know, I think it would be great if there was a more cohesive view of what it's like to build systems with this sort of thing. But on the other hand, the more you work towards that, the more you have, you know, "here's how you build a distributed system with Core Async", and then stuff falls outside of that box.
00:26:24
Speaker
Once you give someone a hammer, they're going to look for nails, I guess, is the way of saying it, right? There was this effort to build a distributed atom some time ago, probably by somebody from Cognitect, I'm not sure if you... Yeah, so years ago there was a system... Based on using ZooKeeper and MongoDB and then trying to build something.

Challenges in Scaling Concurrency

00:26:45
Speaker
Yeah, and I think one of the last
00:26:49
Speaker
issues was... what was the name of it now? Yeah, exactly. Yeah, one of the last issues in the GitHub repo is... I want to say it was... I'm not going to name names, I forget who it is now. Well, basically, someone said, hey,
00:27:07
Speaker
if you go look at the ZooKeeper spec, this can never work. And basically, that was the end. Yeah. Because, as it turns out, there are problems. Like, Clojure is very much based around this idea of
00:27:22
Speaker
things existing a certain way within a computer itself, so that you can do things like validation, you can do things like atomic updates, that sort of thing. And once you get into a distributed world, that's not the case. And I tell people that too. I have actually sat down and written a channel, a Core Async channel, that works across a network.
00:27:47
Speaker
Except it doesn't, you know. There are still those little tiny things where, in these cases,
00:27:53
Speaker
it doesn't quite work. Like, if you pull the network plug, it's going to have different semantics than it would within the same box. And so the question there becomes, should we just use a different programming paradigm in those cases? One of the things that, and again, I don't know, to me it's something which I don't know why it can't be done, but which seems to be against it, is this concept of
00:28:20
Speaker
knowing a bit more about what's going on in Core Async. So understanding how big the channel is, how big the buffer is, inspecting what's in flight. That seems to be missing, and that seems to me quite a...
00:28:34
Speaker
I mean, you know, if we could have that data, more tools could be built on top of it. But that seems to be almost a deliberate design choice, to say, OK, we're not going to give you that. And I would say the problems with Core Async are like that: you know, lack of instrumentation, or lack of ability to introspect or look into that thing.
00:28:55
Speaker
And then secondly, I think that gives problems when you're getting stack traces, or you're diagnosing what's going on. Well, where did it go wrong? What happened?

Core Async's Introspection Challenges

00:29:05
Speaker
Where were these threads? That doesn't come across very well in a podcast, but it's crossing the streams in some way.
00:29:17
Speaker
You know, no, I think you're right, in the sense that that sort of thing is needed. Now, I will be quick to say I have never heard Rich say that as a "we don't want it". All right. OK. In fact, I've heard kind of the opposite: that, yeah, it'd be nice to have more monitoring and this sort of thing. But it's one of those things where it takes a certain amount of design thought, right? And I've gone down that road too, of saying, you know, do I
00:29:46
Speaker
want to devote a year of my life to figuring out how this should be done? And then, of course, there'll be people who disagree with me, or whatever, right? But there are some things in Core Async that make it hard to do that without changing the library itself. So if someone wants to take that on, I would say one of the first things you probably need to do is fork the library and add some more stuff to it, like the go macro itself. And there are people that have done this. There was a talk at...
00:30:14
Speaker
a Conj, or Clojure/West, a while back, where someone did that. They installed hooks into Core Async, redefed a whole bunch of vars, and so they could do some cool things too, like actually diagram how the channels talk to each other, or serialize it, where you could step through a Core Async program by pausing the channels. And, you know, that's great. The thing is, though, those hooks have to be
00:30:42
Speaker
there. And so, you know, I would say what would need to be done in Core Async would be to add those hooks in the mainline, or someone needs to fork it, you know, and add that stuff in. But I guess the biggest question to me is, it feels a bit like nothing's happened with it for a long time. It's at 0.4.474 and that's it. I mean, are people still actually looking at this thing? Is it still getting some love?
00:31:09
Speaker
Well, I think that's the age-old question for a lot of Clojure libraries. Is it perfect, or is it dead? Yeah. Well, no, it's not perfect. That's an interesting dichotomy. I would agree with that too.
00:31:27
Speaker
But yeah, so it was... it's been a few months, a few years now. It's now, I think, part of the mainline Clojure development process. So spec, tools.deps, I think, are part of that; the Clojure language itself and Core Async are all kind of under the umbrella of using Jira and that system, you know. And so
00:31:54
Speaker
yeah, it does run into one of those problems: I think there are things there that need to be fixed, but they're design questions, and it's a little hard getting traction to improve
00:32:08
Speaker
things in Clojure, or in libraries managed like Clojure, if they're design questions, because it just requires someone to sit down and think of all the scenarios, and then run it by the powers that be to make sure that it's acceptable, I guess, and that takes a lot of time.
00:32:28
Speaker
But this is one of the interesting things in the Clojure community: the entire discussion around the development process and lacking some sort of direction. Because recently I started reading something that probably most people know already.
00:32:43
Speaker
I see a lot of parallels between the way Clojure is developed and the way SQLite is developed, because they have this core group of developers, and only they can change the thing. SQLite is available as a public-domain thing, so you can use it for whatever you like. We don't want to complain, but there is a group that knows the stuff.
00:33:08
Speaker
But then I think the analogy breaks down, because SQLite is just... okay, you get it, and then you use it. And Clojure is much more like a language. So I don't know, what is your opinion about having this kind of development?

Critique of Clojure Development

00:33:22
Speaker
Yeah, it's an interesting problem. So, you know, probably some people are aware I've been a little vocal about this on Twitter. But I think one of the things that needs to be said is, by all means — I consider myself some weird mixture of conservative and libertarian, so I very much believe in personal responsibility, personal agency. You can do what you want; I have no right to tell you what to do.
00:33:52
Speaker
And that's great. It is one of those things where, if you want people to continue to use something, it has to be appealing to them somehow, I guess. So, I mean, that's one of the things with core.async — the question is, do people stop using it because it has sharp edges?
00:34:17
Speaker
know, and they don't necessarily want to continue to experience those. And I know that's been true in my own work. At my current job, I don't even know where we use core.async; I don't encounter it much if we do. And that's mostly because of, like you said, the rough edges of stack traces. And these are things you experience in all sorts of
00:34:41
Speaker
libraries, you know — stack traces and the rough edges. But is it something we could improve? Sorry, I was going to say that all languages that use async have, by default, more complicated code. That's just kind of the nature of the beast.
00:34:58
Speaker
But would it be possible to improve that? Could we somehow install some sort of hooks in the go macro to say, hey, if this thing dies, let me know? Or register it in a global atom to say, what are all the go blocks currently running? And these are all things that have their trade-offs. That's kind of the problem here. I'll give you one example. There's been a debate for some time.
00:35:26
Speaker
You know, so within a go macro, you're not supposed to use a blocking take or put, and you can't use the parking take or put inside of a normal thread — because it's a macro thing, it'll actually throw an error. But the other way, you can do it, but you're not supposed to: if you do a blocking take or put within a go macro, it just hogs the thread. And so someone came up with an idea, and this has been around for a while — blocking doesn't work on JavaScript, of course.
00:35:54
Speaker
Right, right. Exactly. But it's a nice construct to have. And so the question is, can we do some sort of macro logic, or magic with dynamic vars, to throw an error? When you start the go block, you set a little flag and say, hey, we're now in a go block. And if you try to run this function, we do a check to say, are we in a go block, and throw an exception.
00:36:18
Speaker
That does have a small performance penalty. We're talking about a hash map lookup for every time you do a put or a take. So the question there becomes, is it worth the performance impact for removing this rough edge?
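The guard being described — set a flag on entering a go block, and have the blocking operations check it — can be sketched outside Clojure too. This is a minimal Python illustration of the same idea, not core.async itself; all names (`go`, `blocking_take`, the toy list-based channel) are made up for the sketch, and the flag lookup stands in for the "small performance penalty" being debated.

```python
import contextvars

# Hypothetical flag marking "we are inside a go block". In the Clojure
# discussion this would be a dynamic var; here it's a context variable.
_in_go_block = contextvars.ContextVar("in_go_block", default=False)

def go(fn, *args):
    """Run fn with the in-go-block flag set (a stand-in for the go macro)."""
    token = _in_go_block.set(True)
    try:
        return fn(*args)
    finally:
        _in_go_block.reset(token)

def blocking_take(channel):
    """A blocking take that refuses to run inside a go block.
    The check is a single context-var lookup per call."""
    if _in_go_block.get():
        raise RuntimeError("blocking take inside a go block would hog the thread")
    return channel.pop(0)  # toy stand-in for a real channel take

ch = [42]
print(blocking_take(ch))  # fine outside a go block

try:
    go(lambda: blocking_take([1]))  # inside a go block: fails fast
except RuntimeError as e:
    print("caught:", e)
```

The trade-off discussed above is visible here: every legitimate take now pays for the flag check, in exchange for turning a silent thread-hogging bug into an immediate, explainable error.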
00:36:37
Speaker
We can have a debate over whether it's right or not. But in the end, like you said, it needs someone to stand up and say, this is the goal, the vision for the library. And I think a good counter-example to this is Reactive Extensions, which Microsoft developed. It's kind of, sort of, not really a competitor to core.async — it's kind of like core.async plus transducers.
00:37:06
Speaker
But what they have is, every single primitive in there even has a visual diagram: if you put this in, here's the output. And the whole spec is designed from top to bottom so that when you go and implement it in a new language, it's easy to say, hey, you ported this wrong or you did that thing wrong. And they thought about how errors are propagated.
00:37:25
Speaker
And, you know, errors are actually mentioned. But that comes back to that Clojure philosophy: Clojure is, in general, optimized for correct programs. And so the question becomes, you know,
00:37:46
Speaker
what does that mean for your everyday development? If you write a program that has a bug, do you really just want to — so, at one point, if you had a transducer on a core.async channel and that transducer threw an exception, it would hard-lock the channel.
00:38:02
Speaker
Like, you could never put anything else into that core.async channel. Nothing could be taken out. It was just dead. There was no way to touch it or do anything with it. And so eventually we got error handlers on channels. That was one of those things where it's like, but wait a minute, you know,
00:38:19
Speaker
we're no longer optimized for correct programs. But in that case, an incorrect program was painful enough, I guess, that it was worth putting that check in, and that small performance impact. Saying, no, don't do that, is a fine response to an error — except for when the way you say it is by shooting someone in the foot, literally, at that point. There's got to be a better way. Who cares that much about performance, though?
00:38:48
Speaker
I'm going to just say that out loud because, to me, the whole point about swapping threads and all this kind of stuff is that it's all predicated on the fact that the CPU has got tons of time. I mean, it really has got tons of time. And you're not going to be CPU-bound; you're mostly going to be IO-bound. So putting a check in there —
00:39:12
Speaker
You're running Clojure, for fuck's sake, which inherently is an interpreted language — I know you can compile it and all this kind of shit, but I don't know. It just seems to me like to go so far as to say, okay, everything must be perfect because we are only optimized for that —
00:39:35
Speaker
For me, it sounds like an excuse. It sounds like a weird way of thinking about things, because putting in some checks or some guards to protect programmers against foolishness — known foolishness —
00:39:51
Speaker
It seems trivial. It seems like what everyone would do. And I've heard that optimized-for-correct-programs business, but I don't get it. I don't know where this is coming from, actually. I've never actually heard Rich say it out loud, so maybe he does believe it fully. I don't think so either. No, yeah. Yeah, it's an interesting question. And two things come to mind with that. One is, I had a coworker tell me once that the Clojure community really fetishizes performance,
00:40:20
Speaker
like, to an unhealthy level. And it's kind of true. It's like, if you show somebody that using vec versus into a vector is two percent faster, they're like, well, I'm always going to use vec. But honestly, we're not at that level.
00:40:43
Speaker
But it's true. I mean, people start to extrapolate that. One of my favorite examples there is multimethods, right? I was working with multimethods once and I asked someone, is there a faster way to do this than multimethods? And they're like, have you benchmarked it?
00:41:01
Speaker
No, I haven't benchmarked it. And they're like, benchmark it, and then we'll talk about it when you can prove that multimethods are slow. And I never went and rewrote the code. Yeah, multimethods were always fast enough. Yeah, and, you know, yes, they're slow — for some value of slow. Yeah.
00:41:16
Speaker
Yeah, but what you said is actually true. One of my favorite speakers to watch is Martin Thompson. He does a lot of work with high-frequency trading and the Disruptor
00:41:32
Speaker
queue, and that sort of stuff. And he has a saying: immutable data structures are like sausage — they're all well and good until you go in the kitchen and see how they're made. And it's true. If you compare an assert — let's check the type of a value when it comes into a function, say a set: we're going to do set union, so make sure everything that comes into the set-union call is a set.
00:42:00
Speaker
That check is way cheaper than the actual cost of adding something to an immutable collection, where you have to copy three arrays and copy all the values over and do a bunch of math. It's way more expensive. To me, immutable data structures are just provably usable.
00:42:24
Speaker
You know, which is great, because they give you a shitload of affordances as a programmer, which is what you want. Absolutely. Right. Yeah. So my point with that is, we've already taken the actual power of the CPU and reduced it, because we want certain conveniences of immutable data structures and that sort of thing. We've already cut the throughput of our system, quote-unquote, in half. Yeah.
00:42:52
Speaker
by using a dynamic language and using immutable data structures — and maybe someone forgot a type hint and there's reflection somewhere in our Clojure code base. And then we're worried about adding an assert. And in some cases — this is the thing I like to point out to people — the HotSpot JIT is really, really good.
00:43:16
Speaker
And so there was a time I actually went in and tried it: how much slower would the clojure.core set functions be if they just checked to make sure that their inputs were a hash set before they did any work? So putting an assert in — and it got faster.
00:43:36
Speaker
And I benchmarked that so many times. And as far as I can tell, the JIT just realizes that, oh, I don't have to do casts in the function — I used to have to do five casts, and now I know the input is of this type, so I don't have to cast anymore in these cases. And so it just removes a bunch of code. So anyway, yeah, back to core.async, I guess.
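The point being argued — a type guard up front is cheap next to the real work an immutable-style operation does — can be sketched in a few lines. This is a hedged Python illustration, not the clojure.core experiment itself (Python has no HotSpot-style JIT, so the "it got faster" effect can't be reproduced here); `set_union` and its frozenset requirement are invented for the sketch.

```python
def set_union(a, b):
    """Union with an explicit type guard up front, the kind of assert
    discussed above. The guard is a cheap isinstance check; the real
    work, allocating a brand-new set, dwarfs it."""
    if not isinstance(a, frozenset) or not isinstance(b, frozenset):
        raise TypeError("set_union expects frozensets")
    return a | b  # builds a new frozenset, far costlier than the check

u = set_union(frozenset({1, 2}), frozenset({2, 3}))
print(sorted(u))  # [1, 2, 3]

try:
    set_union({1}, [2])  # a plain set and a list: fails fast, clearly
except TypeError as e:
    print("caught:", e)
```

Without the guard, the bad call would fail later and more cryptically (or silently produce nonsense); with it, the error names the contract that was violated, at a cost that's trivial next to the allocation.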
00:44:01
Speaker
Yeah, but I mean, I think the general point is that we're talking about, like, let's say the philosophy, aren't we?
00:44:11
Speaker
whether you say it's optimized for correct programs — I'd like to see that written down somewhere in the Clojure rationale or something. It doesn't seem like a decent rationale to me. I've heard Stu say it, but I don't know. Stu said it, and I'm not sure that I've ever heard Rich say it. I'm not trying to say that Stu is wrong or something. If that's what it's optimized for, then fair enough, let's have it out there. But it seems like a weird goal to me.
00:44:43
Speaker
Yeah. Yeah. And, you know, I think the problem maybe with that statement is that it doesn't recognize
00:44:54
Speaker
that taking anything to the extreme is a problem. If we want to optimize everything for correct programs, then when I segfault in the JVM, it should crash my computer — the OS, everything. We do have certain
00:45:14
Speaker
safeguards in place. Hey, I used to program GW-BASIC back in the day — when you accessed memory you weren't supposed to, you had to hit the reset button and reboot DOS. So I don't really want to go back to that. I've been doing some IoT programming recently, and it's still like that, kind of on the bare end of an operating system, which is really just a scheduler. And it's like, you know, this is horrible.
00:45:42
Speaker
This is a really horrible experience in C, at the ascent of civilization. It's not nice. It really isn't nice. I think the correct response to that is schedulers. We would have loved schedulers back in my day. We had one thread.
00:46:05
Speaker
Speaking of the type-level stuff — so I hear you programmed in Erlang, you programmed in Python obviously, and other languages. So
00:46:18
Speaker
what is your experience on the other side of this debate, like the fully static type-checking stuff? Yeah, so I forgot to mention the language there. I was on the .NET platform before I went to Clojure — I was doing that at my day job. And so I spent some time in F#, and even, like,
00:46:38
Speaker
I think I even played around with adapting some of Clojure's concurrency semantics to F#. And then a couple weeks ago, I went back to F#. I was playing around with some programming-language development ideas, and I wanted a language with tail calls, so, hey, F#.
00:46:55
Speaker
And so that's my experience there. And yeah, I like it. It's one of those things where, if I'm writing a compiler, static types really kind of help. I've done a lot of work with the PyPy toolchain, which is statically typed but fully inferred — so you never specify types.
00:47:15
Speaker
It just tries to go through your entire program. It's like, hey, your main takes an array of strings and returns an

Pixie Lisp Language

00:47:23
Speaker
integer. So based upon that, we should be able to infer the entire rest of your code base. And it just throws out an error at some point, like, hey, you used an int here and a string here and we can't unify this. So for writing compilers and stuff, static types are kind of nice. Once you get into more
00:47:44
Speaker
data-driven systems — that's where the dynamic languages win out, specifically data-driven languages like Clojure.
00:47:58
Speaker
Yeah, because I was thinking about the comment you made a few minutes ago — you said, you know, build Clojure on top of Clojure. Because one of the things the Haskell people say is that Haskell is one of the best languages for building other languages: Haskell is really good for writing compilers, so you can build on top of it.
00:48:18
Speaker
And I also saw — maybe I didn't understand properly — in your GitHub, you have Clojure on F# or something like that. Oh, yeah. So a while back I started writing a Lisp on different platforms. And part of that was writing a Lisp in a Lisp, and then making it so you could write the Lisp in a Lisp in a Lisp. And I have this cool file called —
00:48:43
Speaker
I think it's just called deeper, and it just keeps loading a Lisp in a Lisp in a Lisp until your machine grinds to a halt. And it's kind of a fun thing.
00:48:54
Speaker
Yeah, so it's kind of fun. So one of the things I really enjoy studying about this sort of thing — I watched a talk by, let me pull up the name of it here: Nada Amin did a talk a while back called Collapsing Towers of Interpreters. And it's work that she and another,
00:49:19
Speaker
a gentleman, have been working on for some while. But it's the idea of being able to write a language in a language in a language, and have the runtime be able to collapse that down. So you can write a Lisp in a Lisp in a Lisp in a Lisp, and the final one is just as fast as the first one. And it does that through a weird form of partial evaluation and that sort of thing. And that's kind of what I've been spending a lot of my free time on — just
00:49:49
Speaker
kind of trying to unify that with other concepts I'd like to see in a programming language.
00:49:56
Speaker
Yeah. So, let's get to the other topic that we wanted to talk about. How do you handle all these error messages in Clojure? Yeah. So, 1.10 — This is the topic of the year. Yeah, 1.10's gotten better in that way. Once 1.9 came around — well, actually, once spec was in core, I
00:50:19
Speaker
stopped reading the messages, frankly. Like, I'll write a macro and I'll run the macro and it presents a wall of text. And it's just like, well, I just wrote two lines of code, so I can go stare at those two lines of code or stare at this wall of text. Yeah. So it was really bad. But, um,
00:50:41
Speaker
Yeah, you know, I've used spec a couple times in projects, and most of the time I've ended up removing it after a while. It just didn't quite give
00:50:54
Speaker
the power that I was looking for — mostly just because it was so hard to debug. My debug process a lot of times in spec revolved around: you write a spec, your code doesn't fit the spec for whatever reason, it gives you an error, and for some reason it's hard to figure out why. So go fire up the generators and have them generate some data that does fit the spec, and see where you went wrong. And often it's,
00:51:24
Speaker
I wanted a vector spliced in, in this case, instead of a nested vector, or whatever. So, I mean, it's a self-fulfilling problem with spec, right? It's optimized for perfect specs, Tim. Yeah, right. We do need better tooling. Below the belt, that one. No, I understand. No, we need better tooling for spec, but
00:51:51
Speaker
you need better hooks to write that tooling. So as an example — one of the reasons why I haven't used spec a lot is that a lot of the code I write is data-driven, which is funny to say. You're not using spec. Yeah. But in the sense that I worked on a system that didn't use spec; it used Schema, Prismatic's Schema. They have some other libraries with Swagger and the like, so you could write an HTTP endpoint,
00:52:19
Speaker
you write your schema and then you say, hey, give me Swagger for this. And for those that aren't aware, that's a formal spec in JSON that's emitted out, and there's a JavaScript thing you can put on the front end. And that's how we did our documentation: we would write our schema and we would tell the other teams, here you go. If there's something you don't understand, tell us and we'll update the schema to give you better documentation. That's technically possible
00:52:50
Speaker
— in Clojure spec — but it's not reflective. Once you have a spec, you can turn it back into the raw code it came from, but then you've got to interpret that code. Do you want to interpret that code using spec? Then you get specs of specs and this whole thing. What I really want is a data-driven system where I can tell spec, here is data that describes my schema,
00:53:19
Speaker
And at any time I can say, give me the data definition for that spec.
00:53:26
Speaker
Yeah. But, of course, I understand the error-message part of it, but what's the practical reason that you're not that keen on using spec in your programs? Because it does give some value, right? I mean, it will validate; you have all the generators coming out of the code that you've written. Right. So it's very much — there's no tooling for it.
00:53:50
Speaker
Like, you know, with Schema, you can sit down and plug the little different libraries together and get your endpoint. And so at several places I've worked at, the problem has very much been that they already have an endpoint in Schema
00:54:10
Speaker
that has Swagger or this other stuff involved. And that tooling doesn't exist in spec. So you can have a world where half your system is in spec, half is in Schema — sure, that's fine, but you've got to maintain both. Or, really: are we going to write a new system? If we're going to write a new system, we don't want to have to develop those libraries that hook up to Swagger or whatever. I think it's a timing issue in some respects, though, isn't it, Tim?
00:54:41
Speaker
I would say that Schema is pretty much on its way out. They're not really loving it, because I think they see the writing is on the wall that people will start using spec for things. And I agree with you — there's some lamentation there, because it's a very nice library and those guys did great work. But on the other hand, I agree with you. And the things being said the other day,
00:55:11
Speaker
that he thinks there will be more tooling and more data around spec. And that's obviously for the good.
00:55:19
Speaker
But we tend to use it for greenfield stuff, for collaboration. Even if we would throw it all away, it's a nice way to center the data definitions when you've got a team of people with a lot of interfaces across the network, where different people are responsible for different things. Then spec seems to be pretty good at centering us all: yeah, okay, that's what we mean when we're sending this, that's what we mean when we're sending that. As a kind of discussion piece.
00:55:49
Speaker
Especially, it's nice because if you do that in something else, it's not as easy to reify it. It's not as easy to say, oh, yeah, this is what the actual data will look like because those generators, as Vijay says, are not there. So how do you do that with JSON or how do you do that with a Word document? Which is what we're faced with.
00:56:14
Speaker
Right. And I think that's an interesting point: in the real world, so to speak, your teams use different languages. Your front-end team may be writing JavaScript and your back-end team's in Clojure, or you've got to talk to a Node.js service or something like that. And so the question becomes, how useful are those specs
00:56:33
Speaker
if no one else can read them — which they currently can't. So then it becomes a question of, can we translate them to something? I mean, this is the way I think, right? If they want Swagger, or whatever the flavor of the day is, I need a way to extract data from the specs in order to generate that. Or I go the other way.
00:57:00
Speaker
There are a lot of people doing that. They've got some tooling, too. Right. But it's a little weird in the sense that what you get is S-expressions — you've got to parse S-expressions. In a lot of my stuff, I want an AST, not Lisp code.
00:57:25
Speaker
So, of course, you've been building different types of languages and stuff, mostly Lisps on top of these things. So, essentially, a two-part question. One is: what is Pixie, and how are you building it?

Evolution of Programming Languages

00:57:41
Speaker
And the other side of it is that, just before we started, we were discussing that it's been 10 years of Clojure. So, in your opinion, how should a language evolve around the community? What would be the ideal stewardship for a language?
00:57:57
Speaker
Right. So, you know, this may go on for a little bit. As long as you guys want to talk, I'm good to talk. Yes. Yeah. So let me know when you run out of hard drive space. We're in the cloud. You just keep on talking. Exactly.
00:58:17
Speaker
Yeah, so start with Pixie. So Pixie is funny — I originally started Pixie as writing a Lisp on the PyPy platform. I could talk for some time about what I like about the PyPy toolchain; it's got some really cool features.
00:58:35
Speaker
And there's not really — well, by the time I wrote Pixie, there was Pycket, it's called, a Racket on the PyPy toolchain. But at the time, there weren't a whole lot of Lisps on there. So I wanted to do that, and built some cool concepts into it. But also, I didn't quite go far enough with it, in the sense that I ended up with something that looked like
00:59:03
Speaker
Clojure written on PyPy. And my focus now, in a lot of languages — and I'm working on kind of the successor to Pixie — is I want to push immutability further into the language: make things like namespaces, and even the entire interpreter itself, immutable.
00:59:26
Speaker
And I had a branch in Pixie that did this for a while, and I didn't know how to make it fast. That was about two or three years ago, and I think we can actually make it usable now. But the whole idea of, like, what happens when, each time you execute a function, internally you get a new copy of the interpreter, right?
00:59:44
Speaker
And that sounds crazy, but in essence that's kind of the way some of these systems work. And you can optimize that, like with transients. You know, transients can be mutable because nobody ever cares about the old version of the data structure, right? And so you can do some tricks like that. At any point you can say, get me a copy of the interpreter at this time, and you can do that. But going back to Pixie, that's kind of what I ended up with.
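The transient trick mentioned here — mutate in place while building, because no one can observe the intermediate versions, then hand back an immutable result — can be sketched without Clojure. A minimal Python illustration, with made-up function names; tuples stand in for persistent structures and a private list plays the transient.

```python
def persistent_build(items):
    """Persistent-style build: every step produces a new immutable value."""
    acc = ()                  # immutable; each step below copies it
    for x in items:
        acc = acc + (x,)      # new tuple every iteration, O(n) copy
    return acc

def transient_build(items):
    """Transient-style build: mutate a private accumulator, freeze once.
    Safe because the intermediate (mutable) versions never escape."""
    acc = []                  # mutable, but local to this function
    for x in items:
        acc.append(x)         # in-place; old versions don't exist anymore
    return tuple(acc)         # like persistent!: one freeze at the end

print(persistent_build([1, 2, 3]))  # (1, 2, 3)
print(transient_build([1, 2, 3]))   # (1, 2, 3)
```

Both callers see the same immutable result; only the construction differs. That's the observation being applied to the interpreter itself: as long as nobody holds a reference to an old interpreter state, the "new copy per function call" can be implemented by mutation.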
01:00:12
Speaker
What do you do that for? Apart from the satisfaction of doing it, what's the motivation for doing that?
01:00:19
Speaker
Right, so there's a couple of papers I'll direct people to if they want to read up on this sort of stuff. Let's see here, let me get the right one. Programming with Algebraic Effects and Handlers is a paper that talks about taking OCaml and putting an effect system in it. So OCaml is a language that is mostly immutable, but you have some escape hatches. And this algebraic effect system has a way of doing IO
01:00:47
Speaker
in a functionally pure way without going to monads. So the return types of your functions don't change, but yet your entire program is immutable. And you do that through
01:01:04
Speaker
basically delimited continuations and some polymorphism and that sort of thing. So I would like to see that combined with some of the collapsing towers of interpreters. So I love metaprogramming and that sort of thing. And what
01:01:23
Speaker
Let's see here — Collapsing Towers of Interpreters is the paper you want to look at. What they talk about there is being able to write an interpreter and say, okay, I have a function that's my interpreter in my program, and I'm going to give it a program, and I want you to partially apply this program to the interpreter. And what you get out is a compiled function that is completely optimized
01:01:47
Speaker
for that program that you handed the interpreter. So basically, you never have to write a compiler — that's the thing. You write a regex interpreter, you hand it a string, and it gives you a compiled function that is fully optimized to do what that regex was supposed to do. And you don't have to actually write a regex compiler. That's the goal for this sort of thing. So the problem with that sort of thing is, if you partially apply a function
01:02:15
Speaker
and you have side effects in that function — reading from disk, writing to disk — that's going to change the behavior of the function. And so my idea is to use this algebraic effect system to stub out IO. You can think about it like in Haskell, where you use the IO monad to say, go run this function, but don't do the IO yet. And so we're kind of going to
01:02:45
Speaker
put those two things together, to say: I want a dynamic Lisp that is functionally pure, but doesn't use monads — because I kind of hate monads — and allows us to do this sort of cool metaprogramming.
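The "partially apply a program to an interpreter and get a compiled function" idea can be miniaturized. This Python sketch uses a toy pattern language (literal prefixes, not real regexes) and a closure as the "compiled" result — the actual collapsing-towers work uses staging and partial evaluation, not closures, but the shape of the idea is the same. All names are invented for the sketch.

```python
def interpret(pattern, text):
    """Naive interpreter: re-examines the pattern on every call."""
    return text[:len(pattern)] == pattern

def specialize(pattern):
    """Partially apply the interpreter to one pattern. The returned
    closure is the 'compiled' matcher: work that depends only on the
    pattern (here, just its length) happens once, at 'compile' time."""
    n = len(pattern)  # fixed up front, not recomputed per match
    def compiled(text):
        return text[:n] == pattern
    return compiled

match_foo = specialize("foo")  # "hand the interpreter a program..."
print(interpret("foo", "foobar"))  # interpreted each time
print(match_foo("foobar"))         # ...get back a specialized function
print(match_foo("barfoo"))
```

And the side-effect caveat from the conversation shows up even here: if `interpret` read the pattern from disk, `specialize` would bake in whatever was on disk at specialization time — which is exactly why stubbing out IO (the effect-system part) matters before partial evaluation is safe.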
01:03:00
Speaker
So why is your hatred for monads so pure? No, really, it's just the types — the fact that, if anyone's ever written a monad, it changes the signature of your function. So if this thing uses the state monad, it no longer takes a string and returns an int; it takes a string and returns a state monad of int. And in a dynamic language, that's
01:03:29
Speaker
That's a fantastic way to drive you to drink. It's not so bad — so, it's funny, in F#: F# has these cool things called computation expressions. It's like the go block, but
01:03:48
Speaker
as a first-class thing, right? So you can say, hey, within here, redefine let as this and redefine do as this — you can, within a block, redefine all the basic language constructs. And that isn't hell on earth, because of the type system: when you actually try to compile it, it'll say, you know what, you said this is a let, but
01:04:09
Speaker
the values you take here aren't quite lining up. And so you can kind of work with the compiler to make sense of that. But in a dynamic language, you don't have those safeguards. It's like, so why don't you just use types, Tim? No, it's fine.
01:04:28
Speaker
Yeah, so that's the goal for Pixie for you, then — you want to have this immutability all the way down.

Future of Pixie

01:04:36
Speaker
Yeah, so that's kind of where I was going with Pixie a while back. And, you know, at this point, I did enough —
01:04:47
Speaker
I would say, wrong — it's just that I learned enough in the process of Pixie that I'll say it publicly: I haven't touched Pixie in like two, three years, so it's not really a thing. But I am still prototyping some of these other ideas, ideas I learned in Pixie, and I'd like to build something with that.
01:05:08
Speaker
OK, cool. So the other part of the question — well, probably kind of an independent question — there has been a lot of discussion, on Twitter as well, about how the language is built, how the community is around Clojure, and what the priorities are. So, in your opinion, how should the language's priorities change as the language gets bigger? And how do you see the Clojure community?
01:05:37
Speaker
Yeah, that's interesting. So I think you have to recognize that languages have kind of growth cycles, I guess, if you will, and that you always have the language-hoppers that come on at the beginning: it's a new language, it's cool, I always wanted to learn a Lisp, and here's a good Lisp, a modern Lisp.
01:06:00
Speaker
And there's also cycles — not even cycles, just growth — in the programming community in general. I was thinking about this the other day: when I started with Clojure, you would mention the idea of an immutable data structure and everyone's like, what are you talking about? Isn't that really slow? And this sort of thing.
01:06:17
Speaker
And you had to explain how, yeah, it's a little slower, but not enough that you care about it. And people recognize that now. I mentioned Martin Thompson earlier — even he'll say in his talks that if you're not in the core of your program, this stuff is still nice to have. I think he said that at one of his talks. So today, we fast-forward to a point where almost every major language out there has immutable data structures, at least as a library. Microsoft maintains their own immutable data structures as part of an add-on library for C#.
01:06:49
Speaker
And a lot of these features of Clojure we now see in other languages as well. So macros — Elixir has a pretty powerful macro system — and a lot of these sorts of things. So when you talk about growing a language, I think you have to
01:07:06
Speaker
figure out where you want it to be. What do you want the language to be? Because if people just want immutable data and Ruby syntax, they're going to go to Elixir. Yeah, so on that note —
01:07:27
Speaker
I'm not sure where the designers of Closure want to take it, but my impression is that they want it to be a powerful language that can be used by enterprise class companies to kind of harness the
01:07:47
Speaker
power that they have available — the infrastructure they have available — as well as kind of wrangle in the business complexities of the system. And it's one of those things where, if you're optimizing for that, for people that are going to spend five years, that are going to reroute their career to the language, then yeah, maybe some of these other things, like error messages, don't matter as much.
01:08:15
Speaker
But does this view align — I don't want to put a lot of spotlight on you or something, but the question is, does that align? Because you worked within the guild of Clojure core. This view — is it something that you have seen already, or is it something

Clojure Community Concerns

01:08:38
Speaker
that you are observing as a proxy?
01:08:41
Speaker
It's something that I mostly observed as a proxy, so I should probably make that clear: even in my time at Cognitect, that was not necessarily... I wasn't part of the core team any more than anyone else is. I mean, I had the same rights and privileges that any person that wrote a patch did outside of Cognitect.
01:09:02
Speaker
But I think what my concern is there, that you have to have the young blood come in, right? Is that you don't want Clojure to become a
01:09:17
Speaker
COBOL-type thing where it's like, you must have a rack of five servers in order to use this language, or, you know, you must be on AWS. You know, a lot of the Datomic stuff, the Datomic Ions and stuff, you see going to AWS, and that's great. That's cool technology. On the other hand, I haven't worked on a project that uses AWS in about four years,
01:09:45
Speaker
just because all of the big companies I've worked for have their own infrastructure, or, you know, that sort of thing. And at some point you have to have the new people come in. And are they going to want to do that? Or are they going to use Elixir, or, you know, Scala or Kotlin or whatever the newest flavor of the day is?
01:10:09
Speaker
And so, you know, for me, it's fascinating. I came to Clojure and I found it a breath of fresh air, in the sense that I had never seen a tech conference like Clojure/conj, where you walk in and people are like, so I read this paper. You know, the people that wrote Instaparse, Engelberg. I always laugh at that. Apparently that whole thing started when, like, they...
01:10:33
Speaker
Hey, here's a paper on how to write a parser. Hey, son, go write this, you know. And I'm trivializing it, but it was more or less along those lines. And that sort of stuff is cool. core.match is the same way; that was developed by reading a paper. And, you know, hey, I'll come out and say it: I don't see that anymore. I don't see that in the Clojure community. What we have today kind of seems to be
01:11:00
Speaker
fitting things together. It's new AWS technology and we're going to fit it with this thing and we're going to do this and we're going to, you know, but
01:11:09
Speaker
I don't know that that says anything about the community, except that that lost me, and that lost my excitement. I came to Clojure as a compiler and programming nerd, and that spark seems to no longer be in the community. And maybe that's part of it, it's just more mature now. Yeah, that's why I said what I did on Twitter, and that is, you know, I
01:11:35
Speaker
I look elsewhere now for development, like cool projects to work on or new ideas or that sort of thing. And let me be clear, too: part of that is just that my ideas for what I want to write in a programming language are completely impossible on the JVM. The reason I haven't written this successor to Pixie yet is because I can't find a platform that supports the set of features that I need. And so it's like, how am I going to get, you know, delimited continuations and all this stuff into this language or that language?
01:12:05
Speaker
Yeah. But the whole AWS thing, I think that, so this is the contrast between the community versus the company, right? Because I think, from the company point of view, I can imagine that they're targeting a specific demographic, which is going to give them the revenue, obviously. But from the language point of view,
01:12:31
Speaker
I think one of the complaints, as I understood probably, is the lack of community participation. I don't want to go that far saying that it is like a discouragement or something, but this is the argument that I keep hearing from a lot of smart people who are in the community.
01:12:48
Speaker
So is this something that, do you think, is going to be detrimental for Clojure, or is this something that is also giving you kind of a bad feeling about Clojure?
01:13:02
Speaker
Well, yeah, I mean, I think it'll be interesting to see where that ends up. The problem is that it's never a fast thing. I mean, any person who says, you know, if I leave the Clojure community or I leave this community, it's going to die... like, no, no, it's not. I don't think that's the case. But over time, if you see the exodus
01:13:29
Speaker
of people, and those people aren't replaced, then, you know, we may see a problem. And so, to make it more concrete, that's what I see as concerning about the Clojure community right now: we've gone from two conferences to one.
01:13:46
Speaker
I don't know the numbers at the conference, but they haven't been increasing in size. When I've been in the past, they've been kind of more or less the same size, maybe a little smaller. Last I heard, I don't think there's a EuroClojure planned, not that I know of. And some of that's been picked up by the European community, and that's great. And so these are all things that in isolation are no big deal, but one at a time I do question,
01:14:16
Speaker
you know, what's the lifetime? What's the lifetime we are looking at? You know, I don't think Clojure is going away tomorrow or anytime after that, but are we going to see a point where the enterprises are the only places that are using it? I mean, I've seen that kind of change over time as well, in that, you know, you always get the startups, the ones that are willing to take the risk, and then the larger and larger companies. And there's some large corporations that use Clojure now.
01:14:45
Speaker
But those also tend to be the tail end, right? They're the last ones to pick up the new technology. So is it going to be the situation where the startups stop using it altogether, and then you have the few enterprises, and then they move on, and then, you know, 10 years from now or something, that's...
01:15:03
Speaker
That's it. Yeah. I don't know. But I think, what I was going to say before, too: it's part of building a community that the community adapts to the point of view of the leadership. And I think you see this a lot. So I enjoy watching Twitch. It's this, you know, streaming service where people play video games and others watch. Yeah.
01:15:26
Speaker
And it's fascinating to me because there's some streamers who go into their stream and they're my age, 35, they got multiple kids, they have been streaming for five or six years, highly professional, right? Someone says something in their chat room that's derogatory and they just say, yeah, that person's banned. We don't talk like that around here. And over time, the whole community becomes very positive. It's this sort of
01:15:49
Speaker
Everyone's encouraging. Someone makes a derogatory comment about someone and it's like, hey, let's be positive, let's encourage each other, you know. And you contrast that with, and I'm gonna generalize here, the 18-year-old kid who's shooting off his mouth all the time. And, you know, those are the ones that go on to some other streamer and just harass them or say the worst things. It's this type of thing where the
01:16:13
Speaker
energy... you have to put in energy to be an example of how you want people in the community to act, in other words. And I think maybe that's what we're lacking in the Clojure community. It's maybe okay to say, yeah, it's for experts.
01:16:34
Speaker
Or that sort of thing. And it's tiring. Believe me, I've been in the place of trying to mentor people and encourage them and that sort of thing. And it is tiring. It's tiring to run an open source project where you have to explain for the 50th time why you didn't write it this way or that way. I get that. But if you ever stop,
01:16:55
Speaker
If you ever say, I don't want to explain it to you... not that someone said that, but if you give them the impression, I don't want to explain this to you because it would take too much time...
01:17:06
Speaker
But I think we did cross the critical boundary, right, to have the longevity of the language. I don't participate in Twitter that much because, you know, it's a very shitty platform. You're a better man than me. I'm basically there to do marketing, you know, when I need to tweet something.
01:17:27
Speaker
And I'm so careful about saying anything these days. It's like, I don't want to get into a discussion because I don't have time to type the 280 characters and then come up with a better way, because English is not my native language. The other day I was just thinking, every team has a Scrum Master. So how many Scrum slaves do we have? I'm like, okay, I shouldn't tweet about this shit now.
01:17:47
Speaker
That level of shitty discussion I can't tolerate. So anyway, I was thinking, like, you know, we did cross the boundary, or the big hurdle, of having a huge community around it. But I think the stewardship is still with the core. So I think maybe if that becomes a bit more open...
01:18:15
Speaker
For every one person who is tweeting shit out there, there'll probably be a thousand people who are not on Twitter and happily enjoying stuff. So that's the ratio, I would think. I would hope so. And that's really the question here, right? There's no way to know. I had someone tell me that.
01:18:36
Speaker
That came up, right? Someone's like, you know, why are you crapping on the Clojure development process? There's hundreds of people that work towards this. And it's like, but there's not. I mean, I actually ran git stats on Clojure, you know, and there's six commits... six people have committed to Clojure in the past year. And many of those were tickets that were submitted years ago. And...
01:19:03
Speaker
And I see that's the problem with the discussion, and we don't really have a good way of having a meta-discussion about this. No, I don't hold any ill will towards anyone involved, but it's a self-reinforcing problem: the progress of Clojure is so slow because there's not enough people working on it.
01:19:21
Speaker
And there's not enough people working on it because the process is such that it bottlenecks, right? You have a very limited bandwidth of how patches get approved and submitted and that sort of thing. And so the problem I ran into is: okay, I write a patch, I submit it, I then wait for it to be looked at,
01:19:41
Speaker
maybe a week, maybe a month or whatever, for someone to say yes or no. Okay, that's good, that's bad. Once that's done, and someone says, no, you need to fix this thing, I have to get that message, go back, fix it, get back in the frame of mind of what was I doing. And the whole process is not optimized for my benefit as the contributor. If you read the contribution process for Clojure, it's optimized for the
01:20:09
Speaker
head developer of the project, right? The way patches are created, the way they're accepted, the tracking system, everything. I found myself sometimes, you know, if there's a small typo, spending more time creating the patch in the acceptable way than actually doing the work. And that just discourages... it discourages involvement, because the question for me becomes: would I rather spend my time,
01:20:38
Speaker
some of which is going to be wasted, building a patch and getting in and doing all this, or on something where I can have more of an impact? Yeah, well, Tim, I think there's two discussions here. We've talked about meta-discussions, so I think it's a good one. I mean, to me, if there was a sort of process which was, yeah, you need to correct your typo, correct this, correct that, do you patch like this, do you patch like that...
01:21:01
Speaker
But that process was like one or two days, you know, and the feedback loop was pretty quick. You're still in the frame, you've still made the thing, and then you get it accepted because, you know, you've done everything that's been asked of you. And then a week later you see, oh, it's merged. Fucking awesome. And I did that with ClojureScript, by the way.
01:21:25
Speaker
A few weeks ago. I've never done any committing to Clojure or ClojureScript, and I picked up a very tiny, tiny little problem. I mean, it was pathetic. It really was. But I thought, I'll go through the process of setting it all up. And it was nice. I mean, I don't like making all these patches and stuff like that. But the people who were reviewing it were very supportive, very straightforward, and
01:21:49
Speaker
they corrected me on a few things, and even my code was wrong, and they're like, well, maybe you should do it like this. Wow, okay. Basically, they should have written it. But now I've got my name on a very tiny bit of ClojureScript code in their test suite.
01:22:06
Speaker
Great, you know, I feel good about it, you know, and I'll probably do a bit more. And that process of picking up the ticket, doing it, took me like a day, you know, to get through all that process. And then maybe a couple of days to go backwards and forwards to get this. And again, it was totally trivial.
01:22:23
Speaker
But it was always active. The feedback loop was always there. I never felt like it had been dropped into a kind of chasm. And I think that's what I find a little bit... When I look at the Jira tickets on the Clojure side, I'm like, this patch has been there for five fucking years. Someone says it doesn't work anymore. Everyone's lost interest.

Project Change Frustrations

01:22:45
Speaker
Why should I pick it up, make some changes to it, prove that it works with 1.10, for example, or prove that it fails and make some corrections for 1.10, and then let it wait another five years? I mean, I just, I'm all for optimizations in some way, you know? But for fuck's sake, things have to get through. That's not optimal anymore. If nothing ever gets through, it's not optimal.
01:23:09
Speaker
Yeah, it's fascinating because I was thinking a lot about this the past couple of weeks. I realized, wait a minute, I do this every day at work. Yeah. The mentality on most projects in the professional space is: you create the PR, you throw it up there, and if someone doesn't comment on it in like three days, someone's probably just going to auto-merge it. Yeah, exactly.
01:23:30
Speaker
You know, apparently no one cared enough. And there is that, you know, slap-on-the-hand thing where it's like, whoa, you committed something to master and it broke the build, so don't do that.

Optimizing Project Processes

01:23:41
Speaker
But on the other hand, if I commit something to master and it doesn't break the build but it was bad, maybe we need better tests, or maybe, you know, development, this or that or the other. And that really is... that's why, what I said before, is that it's optimized
01:23:59
Speaker
not for the contributor. It's optimized for the time of the head maintainer of the project. I mean, see the thing is I look at like two ways of this. I mean, I think there's kind of like, there's bugs and features, you know? I mean, to me, like the way that bugs get fixed should be in a different channel to the way that features get introduced.
01:24:27
Speaker
Because I know this is a blurred line, but hear me out. In general, there's a column A and a column B. And I get that there's sometimes a little bit of both. But generally, if you've got a bug, you've got a test case, you can prove it. And I've got a line that can fix it, or two lines, or whatever. And I'm not changing the behavior of this thing apart from making it work as advertised. Then that should go through pretty quickly.
01:24:54
Speaker
Because all you're doing is fixing people's code. You're just making life better for everyone, because they've got that feature. So that, to me, should be like, why the fuck doesn't that take five minutes to get fixed? And then if someone has a feature, somebody wants to write some library, like Core Instinct or whatever, or some instrumentation for it, some extra things...

Openness in Design Philosophy

01:25:17
Speaker
Then there's a discussion there, you know, that's more wide-ranging. And maybe it's like, well, I don't know if this is right, if it's in the spirit of the language or whatever. But have that discussion in the open, because then people can get used to what that spirit is, what the designs... the design...
01:25:38
Speaker
What's the phrase? The design... Fuck. The design taste. There's a phrase for it. What are the patterns? The design philosophy, the design patterns. What are people looking for? What are the principles on which you're basing this thing? Because then I can write my code to match those principles, to match that philosophy, or I can just say, fuck it. But for people that want to agree with you and say, okay, actually, I like your design taste, that's what I was looking for.
01:26:07
Speaker
I like your taste in design, and many people like Rich's taste in design. But if it was a bit more obvious, like, well, yeah, I don't like this bit of code or that bit of code, I'm not very happy about that style or that way of doing things... if that was a bit more open, then my thinking is that you get more people contributing more stuff, if indeed you wanted to bottleneck it that way.
01:26:30
Speaker
You know, if you don't want to bottleneck it that way, if you want to just, say, make it free, which is the way he wants to do it, obviously, so let's do it that way, then let's just be more open about it. Because then people will do what you want. Yeah, and I think the thing that has been said several times, but the thing people don't realize, is that
01:26:52
Speaker
that's not the way Clojure is. Let's back up and say: for any of us, if we go and we spend a big chunk of our lives writing a programming language, writing a project, what is the incentive to listen to what anyone else has to say about it?

Creator Vision vs User Feedback

01:27:13
Speaker
Well, can I say something about that? Because you can write what you want. If you've got five people using it,
01:27:19
Speaker
Fuck them, you know, they're your fans. So, you know, you can listen to them if you want to, but you can just say, look, guys, I know what I'm doing and I'm doing it this way. It's my time. Fuck off. I'm just carrying on, I'm just muscling through. And what you say... yeah, I might give a shit, might not. You've got 10,000 people using it? It's not like that anymore.
01:27:42
Speaker
Sorry, you were going to say that, probably. No, it's true. What you said is very true, but there's still no reason why they can't continue to exist in that way, as long as you're okay with the long-term consequences. And the long-term consequences are that some people, like myself, will say, Clojure is a fantastic language and I'll use it at work, but I'm past it now.
01:28:12
Speaker
And I don't think that had to be the case. I think I would still be working on core.async now if I... well, hey, was allowed to work on it as far as design. I mean, I could make improvements, bug fixes or whatever, but that's the extent of what I'm more or less allowed to do. But see, that's...
01:28:33
Speaker
That's the thing: if your goal is to write a language that you are happy using, that helps you with your work, then there's absolutely nothing wrong with running it the way Clojure is run. Yeah, of course. But it's not a long-term solution, I don't think, because there comes a point where people move on. And once the people move on, they're not going to ask their companies to use it. They're not going to...
01:28:59
Speaker
You know, yeah, yeah. Support it. But concretely, I mean, do you think it's a good idea as a community to ask for something like, you know, the Scala Improvement Process, or a PEP, or something like that?
01:29:15
Speaker
Yeah, I mean, I'm just wondering, because I think we can keep on identifying the problem multiple times. I'm pretty sure at least one or two people from Cognitect will listen to this stuff. So I'm just wondering, is this something that will help, you know, the people contributing to the ideas and the designs and stuff like that?
01:29:44
Speaker
Because as Ray was pointing out, you know, there are like two areas: bug fixes and then feature stuff. The feature stuff could be like the Clojure improvement process or Clojure enhancement process, whatever we want to call it, I don't care. You know, more community input, for example. Yeah, I think so. This is a fascinating thing here, in that I've met two types of people in the open source community. And I want to make this fairly clear: I'm not talking about Rich or Stu when I say this, but I have heard people say,
01:30:14
Speaker
you know, why do I care what the community thinks? Because they don't understand the project like I do, or they're just a bunch of whiny people that want whatever feature. Or even the bugs-versus-features thing, right? I've heard it said by people, you know, one person's bug is another person's feature, right? And you run into that problem in Clojure too, right? The fact that set/union
01:30:41
Speaker
in Clojure takes vectors and actually does something. Is that a feature or a bug, right? I could go in, I could fix all of that, but does that maintain backwards compatibility? I don't know. And that kind of goes back to what my thing was about building communities, in that I've also worked in other communities. I really recommend... there's a talk by Pieter Hintjens back in... is that his name? I believe that's his name. Leader of ZeroMQ, yes, I always use the wrong last name there.
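As an illustration of the set/union quirk mentioned here (a sketch of behavior observed in Clojure 1.x, where `clojure.set/union` simply `conj`s one collection onto the other without checking that its arguments are sets):

```clojure
(require '[clojure.set :as set])

;; With actual sets, union behaves as documented:
(set/union #{1 2} #{2 3})   ;=> #{1 2 3}

;; Passed vectors -- which the docstring never promises to support --
;; it "actually does something": it conj's the second collection onto
;; the first, so you get a vector with duplicates rather than a set.
(set/union [1 2] [2 3])     ;=> [1 2 2 3]
```

Making it throw or coerce would change observable behavior that some code may rely on, which is exactly the feature-or-bug compatibility question being raised.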
01:31:11
Speaker
But ZeroMQ... the leader of the ZeroMQ community, from back in 2013. And his philosophy around this sort of thing was doing what I said for PRs, but doing that for features, right? If someone wants to add a feature and everyone's like, yeah, sounds good...
01:31:31
Speaker
And that's just where the conversation ends: put the feature in, you know. Or they say, you know, this is a feature I never use, I'm going to yank it out in this pull request, and it sits there for a month, or even a day or two, and it gets auto-merged, right? And then someone says, wait a minute, I use that feature. Or maybe they don't. The fact is, what if Clojure 1.11 doesn't have feature X
01:31:57
Speaker
that you use? So the question is, do you upgrade your code or do you just not upgrade that version of Clojure? I mean, there's nothing that says that feature has to always exist, you know. But that whole mentality is so foreign to some circles of open source development, that idea that we would just accept features that other people want, or, you know...
01:32:28
Speaker
They can go get their own project, let them fork it, build their own thing. Of course. I think just before we go on, I think there is a fetishization around backward compatibility.

Backward Compatibility vs Scaling

01:32:43
Speaker
I respect that actually because I think people are writing their code and
01:32:49
Speaker
I think it's a bit weird to break it. So I think that the ZeroMQ is definitely an extreme way for sure. Yes, yes. It's an interesting way. I think from his perspective, it was a way to scale the project.
01:33:06
Speaker
Because how are you going to scale it? I think everyone agrees with this, for every bit of software in the world: there's only a certain amount of time and space in the world, and people have only got so many hours in the day. So if you're going to have more features, if you're going to have better features, you need to increase the bandwidth of that project.
01:33:34
Speaker
That's the only way to do it. Or at least maintain that bandwidth and not have it brought down. So the question is, how much bandwidth do you want? Because that's a knob you can turn. That bandwidth can go up or down. That's the question I think we're at right now in the Clojure community: how much do we need? I mean, I hear people say...
01:33:58
Speaker
My coworker said it four years ago: I can't think of a feature that I want that Clojure doesn't have. Yeah, that's valid. On the other hand, hey, it'd be nice if all of clojure.core was specced. That's a lot of work that isn't being done, and we could probably crowdsource something along those lines.
01:34:23
Speaker
Some of it would be, my guess as well is a lot of that stuff, probably 80% of it, 90% of it would be uninteresting. It would be kind of just boring stuff. No one would really argue about it.
01:34:38
Speaker
But there would be 20% which was actually, yeah, I'm not quite sure how this works, like the case you gave. But a lot of it is just boring stuff. It's just grunt work, basically, for want of a better word, that the community could easily pick up. And then, yeah, you'd have some arguments, so you could put a pin in the ones that were more tricky. But that's how you increase the bandwidth. You don't necessarily have to
01:35:08
Speaker
decrease the quality, you can say, okay, I'm going to take 80% of this stuff and make an improvement over these things. Actually, you find out where the problems are that way, I think. Having that discussion.
01:35:21
Speaker
Yeah. Well, yeah. And I think what that requires, like you said, is mentorship. Unfortunately, what it requires is a person with the authority to do the mentorship. Because, you know, hey, I worked on core.async for a long time, and I can't tell you how many questions I answered of why doesn't core.async do X.
01:35:42
Speaker
Well, I asked Rich about this and such and such. There's a problem there, in that sometimes I don't agree with it. Maybe I agree with the person, but they asked why it doesn't have it, and the answer is that Rich has a counter-argument of some sort, right? But that also introduces latency, right? If every time I ask a question of Alex, he has to ask Rich or whatever, that slows the process down.
01:36:11
Speaker
And obviously, that's great. He's going to get cranky about that, but he has to fix it. Why doesn't he just say, okay, I'm only going to take three questions a day and only really super important ones. The other ones, let's just move forward.
01:36:25
Speaker
Well, right. So that's what I mean by the optimization process: I, as a person who wants to contribute to Clojure, currently have no way of talking to the person who makes the decisions about it.
01:36:42
Speaker
Yeah, there's no devolution, basically. There's no... I mean, there is trust, obviously, but that trust is very, very tightly controlled, you know, of who actually does all this committing to the core. Right. Any question I ask is filtered through Alex and then perhaps to Rich, or maybe Alex can answer my question, or whatever. But there's signal loss. It's playing the telephone game and that sort of thing. And so, yeah...
01:37:10
Speaker
And so it's an optimized process where, now, yes, Rich can focus on the things that are important. The cost of that, I believe, is that you lose people that maybe would have wanted to help. Yeah.
01:37:27
Speaker
So I think how many hours have you spent in Skyrim so far?

Skyrim Anecdote

01:37:33
Speaker
Yeah, I looked that up actually just the other day. It's like over 400 somewhere there. Oh, cool. Because I started playing Skyrim on Xbox mostly and then to a point where, you know, like my wife started managing my
01:37:45
Speaker
Skyrim life. I had to ask her, you know, oh, where is this, I don't know, ebony whatever, the thingy. And she's like, oh, that's in Whiterun, in this house, on the upper level, in the drawer. So she started managing the whole shit. And I'm like, okay, this is the time I should stop now.
01:38:04
Speaker
One of the best games ever. For those who care, it is fascinating to me that the entire game is basically developed on top of a database. So all the items, all the locations, are in a record-based database. And people have reverse-engineered it, and in Pascal of all languages... Delphi. So there's, like, no JS wrappers for that library and stuff. So the cool thing is, you have programmatic wrappers to the database format, so you can add
01:38:33
Speaker
add-ons and mods to the game that say, hey, let's go through and, you know, give every bandit in the entire game a health potion so he lives longer, and things like that. And the other thing that's done, this is on PC: there's this thing called SKSE, which is written by a bunch of developers who won't give out their names because they work for gaming companies. But they reverse-engineered the executable, so they found out where all the C++ classes are in the game
01:38:58
Speaker
and installed hooks. So they extended the in-game scripting engine with hooks. It's basically a virus: you load up SKSE, it loads up Skyrim with all its hooks in it, and you can do all sorts of crazy stuff. Of course, once you get to like 200 mods or 400 mods, it gets to be a mess. So there's a thing called Mod Organizer that adds another layer on that.
01:39:20
Speaker
And it reroutes all the file system commands in the game. So whenever Skyrim says, go get this texture from disk, it reroutes it through Mod Organizer, and it says, oh, that's in this folder here. So it keeps all of your mods in a very organized place and then rearranges the file structure.
01:39:38
Speaker
I'm not at that level. And then there's another thing called ENB that hooks into the graphics driver and rewrites all the graphics routines to have better graphics. So the three of those things together are the core of any modded setup. And now you know why I have no time for you.
01:39:56
Speaker
So your Emacs is Skyrim. Yeah, exactly. I hear people talk about Emacs, and it's like, wow... I can talk to you about form IDs and how those are ordered, and why that guy in that game at that position has a different colored head than body. I can tell you that. But I don't have time to tell you about Emacs, or to learn it. I think we should have a Skyrim episode. Yeah.
01:40:22
Speaker
It's fantastic.

Clojure Language Trajectories

01:40:24
Speaker
An 11-year-old game that continues to have new improvements done every day by modders. It's something like 200,000 mods out there. Holy shit. Wow. Maybe that's where Clojure will end up then.
01:40:40
Speaker
I think everybody would have started patching everything. That's the thing that makes Rich break out in a sweat at night. I have actually thought of combining SKSE with a REPL, so I can have a Clojure REPL into Skyrim.
01:40:58
Speaker
But that is going to happen pretty soon with Arcadia and all that shit. Once we have that one, then you have a REPL, and then you can do whatever you like. Crazy stuff. That'll be super fun. Okay. Anyway, so of course, you're spending all day writing Clojure, Clojure stuff still. So what are maybe a couple of things that you actually enjoy within Clojure? Yeah, yeah, absolutely. So I mean, Clojure continues to be just...
01:41:25
Speaker
The thing that continues to astound me about it is how fast it is. I write some database code.
01:41:34
Speaker
that does optimizations, logic engines, stuff like that. And you write some optimization pass that's, like, O(n) something-or-other. And then you run it and you're like, oh, it's going to be slow. And it completes in 23 milliseconds or whatever. And you're like, OK, that works. And that thing uses multimethods and allocates memory all over the place. And it's just shocking to me how fast Clojure is for what it does. Almost every enterprise place I've worked at, I've looked at code and I'm like,
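For listeners who haven't met Clojure's multimethods: they dispatch on an arbitrary function of the arguments, not just on a type. A rough Python analogue (purely illustrative, not Clojure's implementation; all names here are made up) looks like this:

```python
# Rough Python analogue of a Clojure multimethod: dispatch on an
# arbitrary function of the argument rather than on its class alone.

def defmulti(dispatch_fn):
    """Create a multimethod keyed by the dispatch function's result."""
    methods = {}
    def call(arg):
        key = dispatch_fn(arg)
        if key not in methods:
            raise LookupError(f"no method for dispatch value {key!r}")
        return methods[key](arg)
    call.methods = methods  # registry of dispatch value -> implementation
    return call

# Dispatch shapes on their "type" field, like (defmulti area :type).
area = defmulti(lambda shape: shape["type"])
area.methods["circle"] = lambda s: 3.14159 * s["r"] ** 2
area.methods["square"] = lambda s: s["side"] ** 2

print(area({"type": "square", "side": 4}))  # 16
```

Each call does a function invocation plus a hash lookup, which is the kind of "extra" indirection that sounds slow but, as Tim notes, rarely shows up next to a 200 ms web response.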
01:42:03
Speaker
This is slow. This is going to be terrible. And then you look at your web response times and it's like 200 milliseconds, and no one cares at that point. You know, it's fast. Yeah. So in that sense, Clojure is a huge success, in the sense that it's a language that I want to use every day at work. Yeah.
01:42:26
Speaker
And a lot of that is due to Clojure's pragmatism. The fact that it runs on the JVM and doesn't try to do silly stuff like make everything a go macro. Yeah. So you had that one. You had that PR band, did you?
01:42:44
Speaker
Well, no, no. I mean, so, like, Erjang is Erlang on the JVM. The funny thing is Erjang is actually faster than Erlang, at least it was last I checked, just because the JVM is that fast. But they do full-program transformation. It's like JRuby was... that wasn't actually...
01:43:00
Speaker
Yeah, now Pulsar is a library that does that for Clojure. So as you load your classes, it rewrites them using a go-macro-type thing. So your entire program, your entire service, is written in that style and it's transparent, and that's great. But of course you have, you know, trade-offs. Trade-offs, of course, of course. Yeah. Okay.
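The go-macro trick being discussed is rewriting sequential-looking code into a resumable state machine that a scheduler can park and resume. Python generators already are resumable state machines, so a toy round-robin scheduler gives a sketch of the shape of it; this is an illustration of the idea only, not Pulsar's or core.async's actual machinery:

```python
from collections import deque

# Sketch of lightweight-thread scheduling: each task is a resumable
# state machine (here, a generator) that "parks" at yield points and
# is resumed later by a cooperative scheduler.

def worker(name, steps, log):
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield  # park here; the scheduler resumes us later

def run(tasks):
    queue = deque(tasks)
    while queue:
        task = queue.popleft()
        try:
            next(task)          # resume until the next park point
            queue.append(task)  # still alive: reschedule at the back
        except StopIteration:
            pass                # task finished

log = []
run([worker("a", 2, log), worker("b", 2, log)])
print(log)  # interleaved: ['a:0', 'b:0', 'a:1', 'b:1']
```

A go macro does at compile time what `yield` does here at runtime: it carves the body into resumable segments, which is why the whole-program rewriting Tim describes is both transparent and a real trade-off.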
01:43:24
Speaker
Um, I think we're at almost one hour, 45 minutes. Wow. So that's pretty good, yeah. It's good that the clocks went back today in Europe, because otherwise... of course, your biological clocks are still probably stuck on the other time. I have no idea. I think I still feel like it's nine o'clock or something. Yeah. Yeah.
01:43:48
Speaker
Anyway, so of course, Tim, you'll have to come back, because there are a lot of other topics that we want to discuss, especially the logic programming stuff. Any concluding thoughts? Yeah. That's a good question.
01:44:08
Speaker
Apart from everybody should play Skyrim. Yeah. Play Skyrim. No, yeah, it's, I guess, keep learning. You know, about four years ago, I read a paper on miniKanren, μKanren, you know, the logic programming stuff, and that's
01:44:28
Speaker
I've done so much work on that in the past few years, professionally. You know, the philosophy I've always had in computer programming is: if you don't understand something, that's fine. But pick a time and just decide you're going to become an expert, or at least become dangerous, in that.
01:44:47
Speaker
At one point, I didn't understand how networking works: protocols, wiring, IP addresses, and all that. And I focused on that for a year. And at the end of the year, I helped the school I was at rewrite the whole network system. We switched to a NAT-based system with multiple subnets and VLANs and that stuff. And hey, it was enough to be dangerous with, right? Same thing with logic engines. No one can get into that machine in that school anymore now.
01:45:17
Speaker
Yeah, right. No, but if you don't understand Clojure or a Lisp or something, sit down and just give it a year or something. Maybe not that much. I think what's nice is that, because I don't know if we're accused of it or proud of it, but this is a very Clojure-focused podcast.

Exploring Programming Paradigms

01:45:42
Speaker
And it's all about Clojure and the community and kind of like the programming language and the people and all that kind of stuff. And I think that's really good. But it is nice also to kind of explore what's outside of that echo chamber and have different things to say. Elixir is an interesting language. F# is an interesting language.
01:46:09
Speaker
There are plenty of languages out there that are kind of interesting. I got into OCaml myself recently, and I'm not going to leave the bar with OCaml. I've still got my main squeeze, but I like to look occasionally. I'm sorry.
01:46:28
Speaker
Yeah, I think it's got some deficiencies, some things I don't like about it, but there's certainly some nice things, some advantages for different, for smaller platforms. So it's nice to talk to people like you who've got such a breadth of experience in programming languages and core skills like that to understand some of the benefits of looking around and also some of the different perspectives that are around in this huge world of programming.
01:46:59
Speaker
Yeah, absolutely. I have a fascination for programming semantics, dataflow programming, logic programming, constraint programming. There's all these different ways, not just imperative and declarative, but there's dynamic and static and untyped languages and the list goes on.
01:47:19
Speaker
There's a... what is the Wikipedia article? I think it's just "programming paradigms," the Wikipedia article. On the right side of the page, there's a list of like 20 different programming paradigms: object-oriented, functional, constraint... Start reading them. You know, what is a constraint-based program? What does that mean? How's that different from a logic program? How's it different from, you know... And
01:47:49
Speaker
Yeah, there's lots of stuff out there and it's fascinating. Like you said, there is a certain amount of an echo chamber in any language. There's this philosophy in languages that if your language doesn't have a way of talking about something, you have a problem understanding it.
01:48:10
Speaker
And so they've done some experiments with this, and depending on who you believe, it's real or not. But in some cultures, they would call a certain color of green and a certain color of blue the same color. And we see the difference in Westernized cultures, but in these other cultures, they
01:48:30
Speaker
That's the same color. But they had like 30 variants of blue that we could never see. And as I look at that in programming, it's the same sort of thing. If a function is just a procedure call, you're not even going to think about higher-order programming, or even be able to reason in that concept. So yeah, it behooves us to think about things that we could do if we weren't limited by the JVM or limited by
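The higher-order-programming point can be made concrete. Once functions are values you can pass around and return, whole patterns become one-liners that a procedure-only mindset never reaches for; a tiny illustrative example:

```python
# Once functions are first-class values, "transform a transformation"
# is itself just a function call.

def twice(f):
    """Higher-order: takes a function, returns a new composed function."""
    return lambda x: f(f(x))

inc = lambda n: n + 1
add_two = twice(inc)  # a brand-new function, built at runtime

print(add_two(5))                        # 7
print(list(map(twice(inc), [1, 2, 3])))  # [3, 4, 5]
```

In a language where "function" means only "procedure call," there is no vocabulary for `twice` at all, which is exactly the linguistic-relativity point being made above.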
01:49:06
Speaker
Okay. So again, I think it's time to conclude, for this episode at least. Thanks a lot, Tim, for all the hard work and core.async and all of the things. And especially, I'm not sure if you're going to produce more Clojure, learning-Clojure... Yeah. Yeah, I've done some videos of that in the past and I probably will in the future. I'm not quite sure where we'll go with that, but... So what was the URL
01:49:26
Speaker
any language or specific platform.

Clojure Educational Content

01:49:31
Speaker
again? Sorry. Yeah. Um, for the videos and stuff I do.
01:49:35
Speaker
Yeah. So the site I'm on is called PivotShare. It's kind of like YouTube, but they have a better subscription model than YouTube. So it's tbaldridge.pivotshare.com. Actually, just clojurevideos, I think, too. Yeah, clojurevideos.com, I think. Let me make sure that is the domain. Yeah, that's it. Yeah. So clojurevideos.com. Yeah. So check that out. And then, of course, I mean,
01:50:01
Speaker
Hopefully you will continue working in Clojure more and more. Oh, I'm sure. Yeah. I mean, and then one day... Yeah, Clojure is not going anywhere soon. I'm not going to switch languages any time soon. It's, you know, of course. I mean, you compared it earlier to some sort of dating situation, right? It's not a violent-breakup thing. It's just, it's like, what could have been?
01:50:33
Speaker
It is not you, it is Pixie, but it's okay. But, you know, those are fun research projects. That's something that, I think it was Craig Andera, who was a coworker of mine, who brought this up. It was either him or Michael Fogus; one of them talked about, you know, the importance of, like,
01:50:56
Speaker
of just having ideas, prototype the idea, throw it away. You don't have to finish something to learn. Exactly. Yeah.
01:51:06
Speaker
Okay, so on that bombshell, we'll try to conclude this two-hour episode. Probably, I think we'll make it two episodes. I'm not sure. Then obviously, we'll reach 42. There you go. Yes. Finally. We got the holy number, yes. Maybe it's 41A and 41B. I'm sorry to spoil the party.
01:51:29
Speaker
I will just skip 42 completely until Tim comes back again, and then we'll say, oh, okay. There we go.
01:51:38
Speaker
Okay, that's it from us for today. Fantastic discussion again. We'll see you around on the internet, Tim. Hopefully you'll come back and then give us more of your wisdom. Well, I'll come back. I don't know about wisdom. All right. Stay on the line, Carl. We're going to stop the YouTube. Stay on the line. All right. Sounds good. Bye-bye.