
#42 - António Monteiro

defn
Links of Note:
* https://github.com/anmonteiro/lumo
* António's talk on ReasonML: https://www.youtube.com/watch?v=XLnWMfdbQEo
* http://unikernel.org/
* https://reasonml.github.io/
* https://zeit.co/ (serverless platform)
Transcript

Introduction and Episode Setup

00:00:15
Speaker
Welcome to defn, episode number 42, the meaning of everything, with António all the way from Portugal, from a tiny place called Sesimbra. Is that right? That's correct. Yes. Perfect. So this is Vijay from Holland. And Raymond from Belgium. So let's get started. Welcome, António. Hey, it's very nice to be here.
00:00:40
Speaker
Nice to meet you. Did you listen to any of our episodes? I have, actually. Please confess. Yes, I did. Not all of them. I have an entire backlog on my podcast application, but I was thinking about this yesterday, and I think I've listened to a majority of the episodes.
00:00:58
Speaker
Wow. Which means more than 50 percent, I suppose. Yeah. Depends on how you define majority. So you are the one who... We can go with 50 percent, yeah. We basically have to get some sort of an actual college going here.
00:01:14
Speaker
So let's get to the topics of

Patreon Support and Podcast Journey

00:01:18
Speaker
the day. Before we go with the topics, Vijay, can we just do the quick Patreon thing? Yes, please. Because we said last week that we were going to try and shout out to all of the guys that are doing the Patreon.
00:01:31
Speaker
with us. And we did a few call-outs last week, and I want to make a few more call-outs this week, because it really is fantastic, you know, that people are prepared to put, you know, a few dollars or a dollar every month into the pot. And so, you know, we really appreciate it. So the people this month to say hello to and give a shout-out to are Borkdude, Mr. Michiel Borkent,
00:01:58
Speaker
over there in Holland, good friend of the show, Dieter Comandara, Dan Boykis and Jürgen Herzl. Okay, so thank you very much, we appreciate it, we appreciate your support and let's hope we can get from 42 to 69.
00:02:19
Speaker
Yes. I think you might be setting yourself up for failure if you keep saying all these foreign names yourself. Oh, wow. Who should say them for me? I don't know. Maybe patrons could record their names so they come across clearly. Because I'd probably butcher them myself too, is what I'm saying.
00:02:47
Speaker
I think it's part of the deal, you know, if you pay money, you get your name butchered on air. Oh, wow. Oh, yeah. I mean, that makes sense. I mean, you're not paying anything and you're getting your name butchered on air. Okay, there goes, I think, two more patrons dropping off. Yes. Yeah. Screw this. We're out of here.
00:03:13
Speaker
Why am I paying for this abuse? Anyway, but thanks a lot. And I think it's episode number 42. It's been a pretty good run so far, I think. I think I was just commenting before we recorded. We started like 1960s Batman and then the show became darker and darker.

Antonio's Journey into Clojure

00:03:34
Speaker
And then we have been making... I don't know, hopefully this episode is going to bring joy again into Clojure, and hopefully not push people away to OCaml, if you know what I mean. Oh, my God. So let's get started. I take that personally. So António, can you just give us your journey into Clojure?
00:03:58
Speaker
Yeah, so I think I started doing Clojure back in 2013. I was working... I think, like a lot of people... it's not uncommon to hear a story like mine, I suppose. I was doing a Java enterprise, corporate Java and JavaScript job, and I was losing hope, and I
00:04:27
Speaker
thought there has to be something else out there that can make me regain hope for programming, and I guess computers in general. And I was actually really, really amazed when I saw Clojure and its design principles. But I think right off the bat, I was trying to get started with ClojureScript,
00:04:54
Speaker
actually before Clojure. And because the experience of getting started with ClojureScript back then was so rough, especially for a beginner who had never touched the language, my path had to be through Clojure first, and then figure out how I would make ClojureScript work. It's interesting, because I was actually more interested in ClojureScript then, but I just couldn't get it to work.
00:05:23
Speaker
And you're still the number three contributor to ClojureScript. I have done a little bit of work, yeah. Cool. We'll come back to that. What was your reason for favoring ClojureScript over Clojure, António? I think I was... I suppose I was always a little bit more intrigued about how to put beautiful things on the web so that everybody can consume them.
00:05:54
Speaker
I think it's just a personal choice. Before I knew how, I'm not claiming I'm an expert or anything, but before I knew how to make webpages or pretty things on the web, I was always intrigued on how can there be such amazing experiences today and how do I get to the point where I
00:06:22
Speaker
where I can make them too. How do I solve all these problems of synchronizing things from remote endpoints and putting them here in a way that's really nice for users to consume? And I suppose the visual aspect of it was always what intrigued me the most. What was your background then?
00:06:48
Speaker
I'm just a computer science, computer engineering student. That was my first job out of college when I started looking into this. Okay. But I think if you're interested in building stuff for the web, I mean, usually you don't end up thinking, okay, I'm going to pick the roughest possible experience and then start from there. Well, I was already building stuff with JavaScript. Yeah.
00:07:17
Speaker
I think I came to the conclusion that it was going to be kind of impossible for me to figure out how to build these nice experiences that I was seeing all over the place with JavaScript itself and I needed something that I could
00:07:34
Speaker
keep in my head more easily than I could with JavaScript. And I suppose React wasn't out by then. It was, I think, early 2013, and React came out later that year. So it was mostly jQuery and Backbone, and
00:07:53
Speaker
What is it called... the IBM one, I think, Dojo UI? Yeah, Dojo. I think I did some work in Dojo and then, you know, the Yahoo UI, Ext, and then that turned into Ext JS and turned into Sencha and, you know, it was like a mess. And also Backbone and other crap.

ClojureScript Contributions and Challenges

00:08:13
Speaker
Okay, so maybe before we get into other libraries that you built, like Lumo, for example, can you give us some idea about your contribution to ClojureScript and your impression of the language and your opinion about it?
00:08:33
Speaker
I think I came to... or, I didn't start contributing to the ClojureScript compiler right away. I think I really, really got interested in ClojureScript in 2015, when David Nolen started talking about Om Next
00:08:51
Speaker
and how he proposed to solve all the synchronization and, I suppose, data fetching, data loading aspects of UIs. And I think at that point, I was just kind of obsessed with: how does this thing work?
00:09:15
Speaker
Where's he getting all these ideas from? Because it can be kind of overwhelming when David is on his... I think he has those building modes when he's like deep down in the weeds just...
00:09:31
Speaker
building stuff, something where, you know, he has this idea, he sees a way forward. But it's kind of hard for people outside to... you know, just like everything else, to see what people have in their heads. And I think at that point, I really got obsessed with trying to understand what he saw there. And so I think that's how I started contributing to the ClojureScript ecosystem: through Om Next,
00:09:57
Speaker
and building a lot of example UIs in Om Next and seeing how all that worked. And then I think I kind of graduated to the ClojureScript compiler once I saw... you know, I just wanted to help out, and I saw some bugs that I thought were easy enough for a beginner to the code base to contribute to. And then
00:10:24
Speaker
And then my... I suppose we can call it mission... inside the ClojureScript code base was
00:10:32
Speaker
trying to bridge the gap between ClojureScript on its own really nice island and the rest of the ecosystem, be it either Clojure or the wider JavaScript or npm world. And so I think I actually got started with bridging some gaps between the ClojureScript and Clojure languages,
00:10:56
Speaker
in a way that... so, I think ClojureScript didn't support something like :rename in namespace declarations.
00:11:06
Speaker
It also didn't allow you to use some other interop things that Clojure had. And so I suppose my first objective out of that was making sure that reader conditionals were increasingly less necessary. And then I got to a point of, like, okay, so this was really fun, but I kind of lost some interest.
00:11:35
Speaker
And I didn't see the point of... I don't think I saw the point of doing it anymore. And then I started looking further out into the npm and JavaScript ecosystem. And this was after Maria Geller made her contributions in her Google Summer of Code project, where... Google Summer of Code, yes.
00:12:04
Speaker
Because the Google Closure Compiler has support for parsing these JavaScript dependencies that are not exactly in the Google Closure format and outputting a Google Closure format for them.
00:12:18
Speaker
And we started looking into how we could make that support, which was alpha at the time, even better, and make it so that you could just npm install any library and be able to consume it in ClojureScript too. And then a lot of my work was...
00:12:38
Speaker
You know, a lot of the work I was doing was with respect to npm dependencies, and how to consume these dependencies that were not made for the Google Closure Compiler, and then just seamlessly interop with them in ClojureScript, just like if you were calling any ClojureScript namespace.
00:12:58
Speaker
And that turned out to be really, really hard... harder than we expected. There are still some rough edges. But what were the challenges there?
00:13:13
Speaker
And so I think the first one and the more obvious one is that JavaScript is a very dynamic language. And the way you write JavaScript for the Google Closure Compiler to consume, because the Google Closure Compiler is this whole program optimizer that needs to see all of it in order to make assumptions and optimize and aggressively inline stuff and also remove code that is unused.
00:13:38
Speaker
And for it to be able to do those kind of optimizations, it needs to know a lot about your program and your JavaScript needs to be written in this very static manner, which I would argue is not how the majority of JavaScript developers structure their code. And so things like adding
00:14:00
Speaker
adding variables to a module's exports at runtime is something that is impossible to consume, for example, in a static-analysis kind of way, which is basically what the Google Closure Compiler does. And the way it transpiles... or I suppose that's a word, it transpiles... say, a CommonJS module to a
00:14:27
Speaker
Google Closure Compiler module, or Google Closure Library module, is by inspecting all the exports and their dependencies and outputting this static thing. But it doesn't really do any code evaluation there. So things that are very dynamic, such as adding variables at runtime, it can't do.
00:14:49
Speaker
I suppose it would be interesting now that I think about it to combine it with some kind of partial evaluator like Prepack. I'm not sure if you've heard of Prepack. Prepack is this tool by Facebook which tries to
00:15:06
Speaker
It takes your bundle and tries to partially evaluate some parts of it, such that when it outputs the resulting bundle, a lot of things are inlined. So it can, for example, unfold
00:15:23
Speaker
some for loops... or, I guess, the example that they have on their website is: you have this Fibonacci function, and you call Fibonacci with five, and they just inline the number, I think 120, in your code. So what it's doing is partial evaluation, and I suppose
00:15:44
Speaker
this strategy could maybe be combined with the Google Closure Compiler, so that it could, by just evaluating the top-level things of a module, figure out: what does this export, and how can we then compile it down to a Google Closure Library format? But we're nowhere near that.
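The partial-evaluation idea he's describing can be sketched like this (a toy example; the function and the folded constant are illustrative, not Prepack's actual output):

```javascript
// Naive Fibonacci, standing in for any pure function called with
// constant arguments.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

// Source as written: the call happens at runtime, every time.
const answer = fib(10);

// What a partial evaluator could emit instead: the call is evaluated
// once at build time and replaced by its result.
const prepacked = 55; // fib(10) === 55

console.log(answer === prepacked); // true
```

The appeal for module analysis is the same: evaluate the top level once at build time, observe what actually ends up on the exports object, and emit that as static output.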
00:16:06
Speaker
So before coming to Clojure... sorry, ClojureScript. No, no, no. Don't worry about it. I mean, this entire podcast is a tangent. So before coming to ClojureScript, you said you were only working with Java. So is this your first functional...
00:16:30
Speaker
Oh yeah. ClojureScript was... or, I suppose, something that I deployed to production with... my first contact with a functional language. I had done some OCaml, and we can come back to that in a second. I had done some OCaml in college.
00:16:47
Speaker
But at the time... Okay, so if we're going off on tangents, here's another one. The reason why I got back into OCaml was actually ReasonML, the project by Facebook. That's a nice pun. It is both a blessing and a curse. It's really easy to make some unintended puns with that name.
00:17:13
Speaker
But anyway, so I got back into OCaml, because when I did it in college, I think you can say that my professor at the time did a really, really nice job of making sure that none of us would ever touch it again, the way it was taught to us. Everybody just hated how it was presented to us, without any context. So I think that was my introduction to functional programming, actually.
00:17:45
Speaker
But OCaml is a statically typed language, right? It is, yes. Yeah, I had never done any Lisp before encountering Clojure.
00:17:54
Speaker
Nice. So you said that after some time you stopped, or reduced, your contributions to ClojureScript. Is that because of some other reasons, or? There you go. See what I was talking about? Of course, a dark knight again. We have to. We have to. Unfortunately, let's see. I don't know. There is a light at the end of the tunnel. Yeah. So I think
00:18:24
Speaker
there came a point where I just... I'm this huge nerd that's always looking for new and interesting stuff to learn. And at some point I found out about ReasonML, and how Facebook was trying to put, you know... I think you can call it a sugar coat on OCaml
00:18:47
Speaker
with the Reason syntax. And this was even while I was contributing to the ClojureScript compiler. But then again, it was one of those things where, if you live at the cutting edge of something, you have to expect to get broken toolchains and broken setups all the time. So at the time, I wasn't really ready to dive into Reason yet, because everything was really a pain to set up.
00:19:14
Speaker
But this is essentially your thing, right? I mean, the first time... Oh, yeah, exactly. I suppose it is a shitty experience, and then you look for: where else is there a more shitty experience? Where's something that's going to break my entire setup? And I suppose I got really interested in the Reason and OCaml ecosystem there,
00:19:38
Speaker
because it was, for me, a new way to think about how do we structure, or how I structure, my programs in a statically typed language... or not just statically typed, because I had done Java and I guess Golang before, in college.
00:19:56
Speaker
But a language with a strong type system that really tracks the entire flow of your program and if you make a mistake over there, there's something that the compiler knows that you don't know or you hadn't realized yet that tells you,
00:20:15
Speaker
Well, you can't really make that change over here without changing it over there where you're using it. And that was kind of enlightening for me and why I started digging into that even more.
00:20:29
Speaker
But before even getting into the Reason and OCaml stuff, can you give us some behind-the-scenes stuff about Lumo and how it came to be? Yeah, I'd love to.

Creating and Optimizing Lumo

00:20:41
Speaker
I suppose we can establish a timeline here. Yes. It's the retelling of the history. Exactly.
00:20:51
Speaker
So the timeline of my experience with Clojure would go back to early 2013, as I mentioned before. And then I think I started contributing to Om Next in maybe mid-2015. And also, I think late 2015, I started contributing to the Closure compiler.
00:21:16
Speaker
Sorry, the ClojureScript compiler. And then at the end of the summer of 2016, I had my master's thesis to write, and I was looking for other things to do, because I really didn't want to do it.
00:21:31
Speaker
And basically, that's how Lumo came to be. Lumo was just a displacement behavior. It was a whole exercise in procrastination: how to avoid doing the important things. Because Mike Fikes, at the time, had come up with Planck, which predates Lumo by more than a couple of months.
00:21:51
Speaker
I started... you know, something that also really interested me at the time was the optional self-hosting ability of the ClojureScript compiler, and the way that you could just run ClojureScript and compile it without the need for a JVM. Because,
00:22:12
Speaker
knowing a little bit about the JavaScript ecosystem too, I knew that people were not keen on installing a JVM. And so if we were going to bring more people to ClojureScript, we had to provide an option for them to get started... or at least, you know, kind of a gateway drug to the JVM, which would be,
00:22:35
Speaker
I thought, and I still think, a really viable alternative: getting into ClojureScript through a completely self-hosted environment. And so I saw Planck by Mike, and I was really fascinated by how fast it could start up.
00:22:53
Speaker
It was the startup wars, wasn't it? It was, yes, it was. It was the startup wars, because starting a Clojure REPL... a bare Clojure REPL is like one second, and probably even more now with clojure.spec.
00:23:10
Speaker
And starting up a Leiningen project at the time, I think, was like six seconds, just with the tooling and machinery involved. And Planck was at, I think, this 600-millisecond startup time. I was fascinated by how fast you could have something where you could just type something into a REPL.
00:23:32
Speaker
And Mike, he was really clever and made some other optimizations, where he would show you a prompt without even having loaded all the compiler stuff. Delay-loading it. Yeah, so you could just start typing, and the time that you took to write something and hit enter to evaluate it was time that the compiler was loading in the background. Yeah. And so I think at the time it was like, well,
00:24:03
Speaker
I'm really experienced in the Node.js ecosystem, and Mike did this thing that runs on top of JavaScriptCore. What if we had something that ran on top of V8, the Google JavaScript engine, and Node.js, and that could have access to any package in the npm ecosystem?
00:24:26
Speaker
And because it's running in Node, we don't have the problem of consuming libraries and needing to convert them from the CommonJS module type to the Google Closure Library module type, because we're already self-hosting inside this Node environment. So everything is already understood, because we're not really...
00:24:53
Speaker
We don't really need to ship a bundle to somewhere else.
00:24:58
Speaker
And another goal was also: well, how can I be faster than Planck? Of course. And so I had a rough prototype, because getting started with the ClojureScript self-hosted compiler is incredibly easy. And David Nolen and Mike Fikes had put a bunch of examples out there, so that for me, someone who knew ClojureScript at the time, it was easy to get started.
00:25:27
Speaker
And so I started looking into how I could make it work. And after, I think, a couple of days, I had a prototype running. But I had this compiled JavaScript bundle that I would run in Node. And I had to figure out, so how do I package this in a single binary executable that, just like Planck, so I can distribute it?
00:25:52
Speaker
And I was still not happy with it because it wasn't fast enough. So I think Planck was, as I mentioned, 600 milliseconds and I was at about 900 milliseconds sort of time. And I was like, this is not good enough. I need to do better.
00:26:11
Speaker
So Mike actually... I think he deserves most of the credit for me being faster than Planck, because he mentioned to me: well, if you're running on V8, there's this little-known thing that is really cool about V8, which is startup snapshots. And so what startup snapshots are is: when you're compiling V8, and compiling Node.js,
00:26:35
Speaker
you can tell V8 to load this JavaScript file that you give it. It will load it in memory, serialize the heap into a bytecode file, and include it as part of the binary. And whenever it starts up, it just needs to deserialize that bytecode
00:26:55
Speaker
into memory, and you have the same state that you had when you loaded the file at compilation time. So basically, what happens when you start up Lumo is that V8, behind the scenes, is doing this deserialization of the heap. So you're not really parsing and executing any JavaScript, which made it at the time start up in like 150 milliseconds.
00:27:20
Speaker
Wow. And so then I was happy.
00:27:28
Speaker
Super cool. But what is the next step? Before we go into that, though, just stick with Lumo for a second, because I think you got some adoption for scripting environments, didn't you? Because that's the other nice thing about things like Lumo and Planck... especially Lumo, I think, because the other thing I remember about Lumo, which was quite interesting, was the fact that you could run on Linux
00:27:52
Speaker
as well as Mac, and you can run Windows. Because you're following Node, essentially. Exactly, so I do not deserve any credit for any of that, because I'm just piggybacking on what Node does. Well, that is where you get credit, for doing it. It's okay.
00:28:12
Speaker
It's incredible to me how we're always standing on top of the shoulders of giants, especially because V8 is this platform where billions and billions of dollars have been poured into. And I can just take that
00:28:29
Speaker
use it to make some people... like this niche crowd that is into weird Lisp languages... happy with something that is the result of billions of dollars. And I think that's why I'm saying those people do deserve a lot of credit, because I just made it work with this little bracket syntax that we like over here.
00:28:56
Speaker
That's what you want. I mean, there's that whole point about JavaScript... was it Rich? I think in general everyone knows this one... that, you know,
00:29:04
Speaker
Java rocks, but JavaScript reaches, you know. And so the fact that you made it reach... you made ClojureScript reach on the command line, to different platforms... I thought that was pretty important. That was pretty good. Yeah. Windows support was actually, you know... Windows support is generally something that is really hard, or at least something that is disregarded
00:29:29
Speaker
for a lot of people that are doing developer tooling. Windows support was one of the goals, one of the primary goals for Lumo was I'm not going to ship this until it works on Windows. Even though I'm not a Windows user myself, I still wanted people who were running Windows to have the option to have Lumo available to them.
00:29:53
Speaker
You know how it goes: 90% of the time is spent doing the last 10% of the work. And that was Windows support for Lumo. Well, isn't it turning around in a way now, though? Let me just say one more thing, which is, I really need to
00:30:18
Speaker
you know, publicly thank the Berlin Clojure User Group, because that's where I initially announced Lumo. I contacted, I think, Paulus at the time, who was running the meetup, and I asked him: well, can I come speak? I have something really cool to show. I want to talk about self-hosted ClojureScript. So you were kind of really burying the lede there.
00:30:47
Speaker
It was a really cool day. I announced Lumo at the meetup and it got incredible reception. And those people are really, really nice and they're an incredible group of people. So, you know, thanks.
00:31:00
Speaker
Yeah, of course. We have, I think, the highest following in Germany, so we are very popular there. Practically big celebrities in Germany. Wow. I mean, I was there a couple of weeks ago. I didn't get any paparazzi, but I'm supposing they were hiding somewhere. I see, I see. But anyway, it's really cool. I was actually asking about the future of Lumo. Where do you see it?
00:31:25
Speaker
Yeah. Maybe before we even get there... I think we kept talking about Lumo. What is the elevator pitch? What exactly is Lumo? Maybe just say one thing about it. So Lumo, the way I tell it is... because there's still one missing piece that I need to talk about. But the elevator pitch is basically: Lumo is this fast, cross-platform ClojureScript environment that you can have...
00:31:53
Speaker
You know, it's one click away... it's one click away from you installing it and getting ClojureScript running anywhere. And so all you need to do is npm install... npm install, yeah, I suppose that's npm install -g lumo-cljs... and you're off to the races. Awesome. That's awesome. That's an awesome beginner experience. I mean, that's got to be said, you know. And also, I
00:32:21
Speaker
It's incredible how, like, you put something out there and then people start using it in all sorts of ways that you never anticipated. And so, this friend of mine, Victor from Sweden, he's running this company called Pilloxa, and they're doing... they're doing...
00:32:40
Speaker
They're doing... I really don't want to get this wrong. So I think they're doing a smart device for people who are chronically ill, to take their medication. Because if you've just come off an organ transplant and you miss like two pills in a row, you might just reject the organ, right? And so they have this smart
00:33:11
Speaker
pillbox that is connected to your phone somehow... probably Bluetooth or Wi-Fi, I'm not sure... and you get a notification every time you've got to take a pill.
00:33:22
Speaker
And so he mentioned to me that the way he was building the firmware for the smart pillbox... I'm not sure if he's still using it, but initially he used a Lumo script, and that's how he was doing it. And I was like, that's amazing. I never imagined that Lumo could be used for that.
00:33:45
Speaker
And other people are orchestrating CI builds with Lumo, and that's also really cool. And there are, you know, some more use cases that are also cool. So Juxt in London, they were developing this replacement for Make and Makefiles using ClojureScript... it's called Mach... and using Lumo.
00:34:06
Speaker
Nice. And so the missing piece that I was talking about... the reason why I described Lumo as a ClojureScript environment and not just a REPL is because, besides all the scripting abilities... and also, this was the second step:
00:34:28
Speaker
After that phase was done, I wanted to see if I could be the first
00:34:39
Speaker
tool to provide actual ClojureScript compilation without the JVM. So one thing that I think I did... so Lumo was released in November 2016, and I think in February 2017, or March, I wrote a blog post
00:35:01
Speaker
about compiling ClojureScript projects without the JVM. And I basically re-implemented it. And then, again, on the topic of standing on the shoulders of giants: the Google Closure Compiler team used GWT, the Google Web Toolkit, to compile the Google Closure Compiler, which is written in Java, to JavaScript, and they put a version of the Google Closure Compiler on npm.
00:35:30
Speaker
And so once that was available, then I thought: I do think that we have 100% of all the pieces that are necessary to make a ClojureScript workflow exclusively on Node, without ever needing to start up a JVM.
00:35:49
Speaker
And I set out to do just that. And I accomplished it, as I said, I think in February or March 2017: you could have a ClojureScript project that is, obviously, self-hosted compatible... so you can't really use any JVM macro machinery... and you can compile that ClojureScript project
00:36:17
Speaker
and get a Google Closure Compiler advanced-optimized, aggressively inlined bundle, just like you would with the JVM. And I thought at the time, well, this was really cool. Yeah, it still is. Yeah.
00:36:38
Speaker
Yeah. Well, unfortunately, and I think as you might expect, a lot of people are not using that, simply because most of the libraries out there, in one way or the other, use some JVM libraries in macros, even for ClojureScript. So the libraries that are self-hosted compatible are still, I think, in the minority.
00:37:03
Speaker
There might be people using it, and if you are, please reach out. I want to know about it. I don't think I know anyone who's using Lumo for this use case. But is the macro support the only missing piece that is going to stop people from using this?
00:37:30
Speaker
By macro support... I want to clarify that macros are still supported in self-hosted ClojureScript. You just cannot use any Java standard libraries in macros, like you would in JVM ClojureScript.
00:37:44
Speaker
And I think, taking a step back: macros in ClojureScript are defined in Clojure, because the ClojureScript compiler is written in Clojure, and it expands your macros in Clojure as part of the analysis process. And so you can't use macros that take advantage of Java tooling and libraries in an environment where there is no JVM running.
00:38:09
Speaker
But you can still write macros in self-hosted ClojureScript. So that is one of the limitations. I suppose another one could be that while the JVM is optimized for throughput, you could say that generally JavaScript engines are optimized for latency and time to first
00:38:35
Speaker
execution or something. But they might not be as fast and they're generally not as fast as Java simply because Java has all the information about types at compile time so it can make all the necessary decisions.
00:38:52
Speaker
And the consequence of that is that compiling ClojureScript projects with Lumo, and especially with the Google Closure Compiler on JavaScript, takes longer than compiling the same project on the JVM.
00:39:08
Speaker
So I've seen compiling a very small project with advanced optimizations in Lumo take like 60 seconds, whereas it would take maybe 15 seconds on the JVM. Okay. So what's next for Lumo? So right now,
00:39:37
Speaker
I think a couple of weeks ago I released a version of Lumo that is up to date with the latest version of the ClojureScript compiler. So it's running the latest ClojureScript compiler but I have officially stopped working on it and I would love if people who want to see Lumo keep getting updated and
00:40:01
Speaker
and, you know, squash some bugs that it certainly has, to come forward and reach out to me and say that they want to be a maintainer. And I would gladly talk to them and coach them and mentor them
00:40:17
Speaker
through the Lumo code base and build process, because I have lost interest. For me, the things that I set out to do are done... the startup timing, the ClojureScript compilation... and now, anybody that would
00:40:39
Speaker
basically want to carry the flag forward. I would be more than happy to welcome them into the project and help them do that. Yeah.
00:40:53
Speaker
That's really good. Just because you're starting an open source project doesn't mean you have to own it for life. Yeah, exactly. I certainly don't want to see that. I want people to keep using it. I'm just not going to be the one maintaining it anymore.
00:41:09
Speaker
Yeah, yeah. But how challenging is it to maintain it? Or do you have a kind of vision or roadmap for it in the next, I don't know, six months, one year, two years or so? I do think it's been pretty stable since its beginning, with the exception of the compilation workflow, the last thing we were talking about. Yeah.
00:41:37
Speaker
which was alpha for some time, and then Andrea Richiardi and I made some more improvements to it.
00:41:50
Speaker
And I think it's now pretty stable. It might not be up to date with the latest ClojureScript compiler, or the way it does things. But other than that, for the workflows that people have been using Lumo for, it's been very, very stable. And so I think that the roadmap for that is just...
00:42:12
Speaker
keeping up to date with ClojureScript, fixing some bugs which generally appear at the boundary of the interop with Node.js, and keeping the Node.js version up to date so that people can use the latest Node.js features. Yeah, yeah. Makes sense. That's fantastic. So from Lumo to next steps: where are you right now?

Exploring OCaml and Functional Paradigms

00:42:41
Speaker
Right now, I'm really interested in working with OCaml. I got into OCaml through the ReasonML project. That's why I started, and still contribute to, the ReasonML parser and pretty-printer, the entire toolchain.
00:43:04
Speaker
Even though I'm mostly using OCaml syntax nowadays. And the reason for that is because I see it as a really viable alternative to get people started on liking functional programming, and strongly typed functional programming, which is something that I have really come to appreciate. And so that's the area where I'm most active now and most interested in.
00:43:32
Speaker
So are you going to go work for Jane Street now? Well, you never know. I don't really see myself working for Jane Street, but who knows? I'm just trying to think of any other company that
00:43:49
Speaker
I think Facebook is using OCaml a lot for their internal infrastructure. ReasonML is actually powering more than 80% of messenger.com now on the front end.
00:44:04
Speaker
So if you use Facebook Messenger on the web, you're using an OCaml project. The thing I found interesting about OCaml in the past few years, which makes me think it's a very interesting language as well, is the unikernel stuff that's been coming out of the Mirage project, and also now all the kind of native Docker stuff.
00:44:32
Speaker
So what kind of use cases are you looking at at the moment? Obviously we're talking about a big range of possibilities here. So what kind of things are you getting excited about with OCaml? So the reason why I got into OCaml was, I was trying it out for a little while, and then the thing that really got me hooked was that I saw in OCaml
00:44:58
Speaker
the thing that I saw in Clojure when I initially got started, which is a language in which all the design decisions have been thought about and have been carefully considered for a long time. And I saw it as a really pragmatic language
00:45:23
Speaker
in terms of usability and the features that it provided, and still does, in a way that, you know, maybe this is ignorance on my part as well, but that I don't see in other languages currently. Maybe I'm just unaware of them.
00:45:41
Speaker
And that's really what got me hooked into OCaml. Did you compare it with Haskell or some other kind of classic MLs? Yes, I did. Actually, before looking into OCaml, I started looking into Haskell and I read the
00:45:57
Speaker
Learn You a Haskell for Great Good, I think that's how the book goes. And the idea that I got from it was, you know... let's say the analogy that I like to use is that Haskell is a hundred percent language, whereas OCaml would be an 80 percent language.
00:46:21
Speaker
And what this means, or what I want to make it mean, is that Haskell doesn't allow you to do any side effects.
00:46:34
Speaker
unless it can control them. You need to explicitly ask the compiler for permission to do a side effect, and then thread all the side effects through the, you know... let's, okay, here it comes, the M word... through the IO monad.
00:46:54
Speaker
Whereas in OCaml, the category theory is still there, but you don't really refer to any of the concepts through the theoretical nomenclature, if you will. So you never say that, oh, I'm using the whatever monad or the bind function in the monad or whatever.
00:47:20
Speaker
And another thing that it gives you is, because it's a strictly evaluated language, as opposed to a lazily evaluated language like Haskell, OCaml lets you intermingle side effects with your code. And so in a regular function, you can just put a statement that is printing to the
00:47:45
Speaker
standard output, or you can write to a file, or make a network request, or what have you.
00:47:54
Speaker
And so that still goes through the type system, right? So in a function that returns an integer, for example, you can't return a side effect, right? You can't return a print, a console.log or whatever, in a function that's supposed to return an integer. But you can say that at the top of the function you print to standard output, and then you return the integer.
00:48:18
Speaker
So it still goes through the type checker, but it allows you to have these escape hatches, which I think are not only necessary but mandatory in the ways that we need to structure our programs today.
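What's being described here can be sketched in a few lines of OCaml. This is a hypothetical minimal example, not code from the episode:

```ocaml
(* double has type int -> int: the type checker verifies the return
   value, but a side effect can still happen on the way, with no IO
   monad required. *)
let double (n : int) : int =
  print_endline "doubling...";  (* side effect, checked as unit *)
  n * 2

(* This would be rejected by the type checker, because print_endline
   returns unit, not int:
   let bad (n : int) : int = print_endline "oops" *)

let () = Printf.printf "%d\n" (double 21)
```

The print statement still type-checks (as `unit`), so the side effect flows through the type system rather than around it.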
00:48:37
Speaker
So obviously, comparing on the type system is a bit of a weird comparison, because you can't compare the same thing with Clojure. But if you pick the other parts of Clojure, for example, the whole immutability story, or concurrency, those kinds of things, and the availability of libraries: how do you compare that with the OCaml language or ecosystem compared to Clojure? Yeah.
00:49:06
Speaker
I think they're both pragmatic in their own way, which is the conclusion that I came to. And so Clojure is really nice because... the thing that since the beginning really amazed me and got me fascinated about Clojure was
00:49:24
Speaker
the possibility of describing your data through these built-in literals that never change. So if you have a vector with one element and you, let's add the air quotes here, "add" one element to it, you're not really doing that. You're saying the result of this operation is a new data structure.
00:49:52
Speaker
One that has, you know, all of these internal optimizations to make it share memory and all that, but it's a new thing. And that kind of thing I don't think I had seen before.
00:50:06
Speaker
I really wanted to write my programs like that, because I saw something really cool there, something that I had never thought about. And I would say that is the pragmatism that I'm talking about. And so when I found out about OCaml...
00:50:26
Speaker
OCaml doesn't have built-in immutable data structures like that by default, but it has a number of other things that I was looking for at the time. Say, in Clojure, I really got tired of writing my code, running it, trying it out, and then having it blow up in my face. Like, oh my God, I'm so dumb.
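To make the contrast concrete: OCaml's built-in lists are in fact immutable with structural sharing, even if there's no Clojure-style persistent vector or map out of the box. A small sketch, not from the episode:

```ocaml
(* "Adding" to an OCaml list builds a new list; the old one is
   unchanged, and the tail is physically shared, much like Clojure's
   persistent data structures share memory internally. *)
let xs = [2; 3]
let ys = 1 :: xs  (* new list; xs still has exactly two elements *)

let () =
  Printf.printf "xs: %d elements, ys: %d elements\n"
    (List.length xs) (List.length ys)
```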
00:50:56
Speaker
I made a typo, or I cannot get this right. In OCaml, it's almost like having a little buddy tapping me on the shoulder whenever I make a mistake.
00:51:12
Speaker
Hey, you can't really run this program without fixing this whole thing right here, because I'm not allowed to segfault, or whatever the equivalent would be for the front end. And so it really pokes you in the shoulder
00:51:32
Speaker
and tells you, please go fix that mistake or I will refuse to run. And for me, that was incredibly helpful because I found out that the things that I want to program are not things that I can keep in my head at all times. And having something, offloading all that work from my brain to the machine,
00:52:03
Speaker
made me think of the problem that I was trying to solve, instead of worrying about making these mistakes and having all this cognitive overload of things that I need to get right. Not in terms of syntax, because you can argue Lisps don't have any syntax, but in terms of
00:52:24
Speaker
What are the fields that these data structures have? What can I put in this map? How can I make this functional transformation operation over this data structure?
00:52:37
Speaker
In OCaml, I'm finding that whenever I want to do that, I can just hover over any variable, and my editor will tell me: this is the exact set of fields that this data structure has, and these are the only operations that you're allowed to make
00:53:00
Speaker
on these data structures, and these are the only ways that you can transform these data structures, because you can't really access a field X in something that only has a Y, if that makes sense.
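In OCaml terms, that "field X on something that only has a Y" situation is a compile-time error. A hypothetical sketch, not from the episode:

```ocaml
(* A record type: the compiler, and therefore the editor tooling,
   knows the exact set of fields. *)
type point = { x : int; y : int }

let p = { x = 1; y = 2 }

(* Uncommenting this fails to compile with
   "Unbound record field z":
   let _ = p.z *)

let () = Printf.printf "%d\n" (p.x + p.y)
```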
00:53:15
Speaker
Yeah, I mean, but this is pretty much standard for all the static languages, right? I mean, you have that to some extent in Java, and I did a lot of Scala as well. I wrote a lot of Scala before. I would say that Scala is also strongly typed, but not Java. Yeah, of course, yeah.
00:53:30
Speaker
Yeah, well, Java has a bit more static typing than Clojure, right? A bit more, so you have much more support from the IDE, for example, you know, having the hovering shit or whatever. Yeah, exactly. So the way I would contrast that is that types in Golang, or types in C or Java, are there
00:53:53
Speaker
And it's mandatory that you write them down. They are there so that you can tell the compiler how to optimize the generated machine code for your use case. Whereas in OCaml or Haskell, you don't really need to write down your types; you can if you want. But the types are there for the compiler to help you structure your program. It's kind of a different flow of information.
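A quick sketch of that "you don't need to write types down" point, assuming a plain OCaml compiler or toplevel:

```ocaml
(* No annotations written, yet the function is fully, statically
   typed: the compiler infers add : int -> int -> int from the use
   of the integer (+) operator. *)
let add a b = a + b

(* Annotations are allowed, but optional: *)
let add_annotated (a : int) (b : int) : int = a + b

let () = Printf.printf "%d %d\n" (add 1 2) (add_annotated 3 4)
```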
00:54:21
Speaker
Yeah, exactly. And also, the development experience is a bit different, right? Because in Clojure, in Lisp, you'll be more like reppling constantly and then trying to write one function and then, okay, get it run, that's running nice, and then move on to the next step. But in the case of Haskell or Scala, for example, that I'm used to a bit, it's not the same experience. I mean, it's like the whole thing needs to compile or not.
00:54:48
Speaker
What is it? The write-compile-run cycle? Yeah, exactly. You know, something that also annoyed me, and maybe this is going off on a tangent again, is Clojure's startup time. Because the whole REPL thing is really, really cool, but I would argue that, really,
00:55:14
Speaker
It doesn't serve a lot of use cases, or at least it didn't serve mine. The way I generally write software is I'm constantly switching between Git branches. Say I'm implementing this feature, and then I hit a roadblock and I'm waiting on someone from the business side to give me an answer, and so I switch to another branch where I have another feature.
00:55:36
Speaker
And I found out that in my setup, I couldn't really keep my REPL open at all times. So with Git branches where you, say, remove a file or add a new file, and I think those are the most problematic cases, I would have to constantly keep restarting my REPL. And that also added a lot of overhead to my
00:56:05
Speaker
development. And I found out that those kinds of things that cause you to switch contexts are the things that are most unhelpful to your development and your process, or my mental process at least.
00:56:25
Speaker
And how about the libraries and the tools and ecosystem and deployment and debugging? Because we keep talking about language level stuff. Of course, I understand language is the baseline for everything. But most of the time, the challenges are integrating with external system. Most of the time, the challenges are, how do I deploy this shit? How do I monitor this thing? Those kind of things. So how is that on the other side of the pond, so to speak?

Unikernels and Future Tech

00:56:51
Speaker
Yeah. Oh, yeah. Right. So Ray was asking about unikernels and deployment scenarios. That is also something that I...
00:57:00
Speaker
I'm really fascinated by. I must say that I have only played with unikernels this one time, and for a little bit. And I really didn't get one deployed to any hypervisor; I only got it running on my computer. I think unikernels are really cool, and there might be something there in the future. They're just not there yet, because the development overhead is still significant.
00:57:27
Speaker
Because, you know, choosing the packages... Exactly, yes. So whenever you deploy, say, a server to a cloud provider, you can just, you know, use time in your logs, or log to disk, or have them send
00:57:48
Speaker
packets via a socket. In a unikernel scenario, you need to think about that way ahead of time and say: okay, now I want to use the time library, so I need to include that in my library operating system.
00:58:08
Speaker
And then: oh, I'm also going to need to use the disk, so I need to include the disk driver. Because there's really no OS kernel in a scenario like that; you need to include all of these, I suppose they're called kernel modules. And I think there's significant,
00:58:37
Speaker
I'd say, business value to be created there if someone makes that whole process much easier. I think people are working on that, aren't they? Sorry, go ahead. I think people are working on that. Yeah, I think I'm aware of one or two companies that are working on it.
00:58:53
Speaker
The interesting thing about unikernels as I understand it, and the thing that interests me about them, is that you can't log into the machine. That's the huge advantage. So, leaving aside the kind of smallness, if you just take out the login shells, then you kind of...
00:59:19
Speaker
you may not be in your most optimal unikernel, but you're certainly in the spirit of what is good about a unikernel, i.e. that you can't log into it. It really is a kind of application-appliance-type environment, and not a general-purpose operating system anymore. Exactly. And, okay, so I'm gonna add to that and I'm gonna
00:59:43
Speaker
I'm gonna add to that by going off on another tangent. Do it. Please, go ahead. Which seems to be the topic today. It's always the topic. Tangential discussions, topic of the day. Number one vegetarian Clojure podcast going off on tangents. Of course.
01:00:05
Speaker
Thinking about deployment scenarios for the OCaml programs that I work on, there's always the traditional Docker setup, where you build your project in there and you deploy to Kubernetes, or you deploy to a standalone server somewhere that you don't care about.
01:00:25
Speaker
But a paradigm that I've been, as of recently, closely monitoring is the whole serverless paradigm. And you mentioned that unikernels, as library operating systems, take the whole login shell out of the equation. So you can say that a whole lot of security issues are mitigated by that.
01:00:55
Speaker
Another really interesting thing about unikernels that I see is that, because you don't have to include this very heavyweight Linux kernel in your program and you're only selecting the kernel modules that you need, you're running directly on top of the cloud hypervisor.
01:01:19
Speaker
You can just basically have a unikernel start up for every request that comes in, which makes scaling problems go away very, very easily. So if you have something that starts up in 20 to 50 milliseconds, you can afford to pay that price. I'm not saying for every request, but say for every 10 requests that come in,
01:01:46
Speaker
you degrade the experience of one request by 20 or 50 milliseconds, and you can just spawn and kill, I was going to say machines, but you can spawn or kill unikernels
01:02:07
Speaker
in a way that adds a granularity that simply does not exist right now, except in the functions-as-a-service kind of scenario. And they're definitely cheating.
01:02:25
Speaker
Yeah, exactly. And so this is where I think the tangent begins. I've been looking at this service called Now, by this company called Zeit. So it's zeit.co. And they have this service, and they just launched their 2.0 infrastructure. Yeah, the 2.0 thing, yes.
01:02:47
Speaker
which is basically embracing lambdas, AWS Lambdas, so functions as a service all over. And so in a Lambda context, the way your code is run, I think, at least in AWS, is that
01:03:09
Speaker
One Lambda is never invoked concurrently. And so if you have two people making a request to your server at the same time, they're going to be served by two different Lambdas. And so that's something that is really cool which also removes the
01:03:38
Speaker
scalability problem, in some ways. It's like any other engineering decision, it's a trade-off, because now you're forced to structure your program in another way. And so something that I recently built, over the past week actually, was this... So they have the concept of this thing called builders. And a builder is something that
01:04:09
Speaker
builds the code that you upload to their platform and generates a lambda. So I built this builder that allows you to run any statically compiled binary in their lambda context, on their infrastructure.
01:04:25
Speaker
And so I've recently made an experiment of running a statically compiled OCaml server in 80-plus lambdas through the Zeit Now platform. And so that's kind of cheating as well, I understand that, just a little bit.
01:04:47
Speaker
So it uses Node.js to start up this process then? It does not. It uses Golang. Oh, yeah, OK. Well, because I didn't want to pay the price of starting up Node.js. Go just starts in 10 milliseconds or less. Node.js is like 60 or 80. Yeah. So if I'm going to introduce any kind of overhead, at least it should be a little. OK. So it's the Go binary starting the OCaml binary.
01:05:16
Speaker
The Go binary, yes. The Go lambda is starting the OCaml server.
01:05:23
Speaker
But Lambdas also... so I mentioned that you can't have a Lambda serve two requests concurrently. However, what you can do, or this is entirely managed by the cloud provider, but what they do is, I think, AWS keeps your Lambda context hot for 15 minutes.
01:05:47
Speaker
So if you have two requests per second, you're always only going to have two Lambdas for 15 minutes, at which point I think they die and then two others come up. Because OCaml static binaries start up really, really fast, you only pay the price of starting the server on the first invocation. And I could imagine a scenario where you have a cron job that runs,
01:06:17
Speaker
say, every second, just making a request to keep the Lambda hot or something. Yeah. And yeah, that is something that I've been really excited about recently. But this is a completely different way of building applications, right? It is. Because we are so much... I looked into Zeit before, when they were at the version one thing as well, you know, the cross-cloud-provider things and everything.
01:06:40
Speaker
I mean, it requires basically, I don't know, re-architecting what you think an application is. Because, you know, in the traditional way, okay, I have a database, I have a front end or whatever, and I'm going to deploy these things. But how, coming back to your point that, you know,
01:07:02
Speaker
The cognitive load of, you know, coming from Clojure to: oh, there is a nice programming language with a static type system that is helping me understand the system. Doesn't the same complexity apply to these kinds of things? Because shit is split into a million lambdas, and now I need to think about how to interact between all this crap. Yeah, I would agree that
01:07:26
Speaker
Yeah, because I was looking at... of course, I built some Lambda functions, especially for the ETL sort of jobs and everything. And at some point I'm like, okay, I have to switch to Step Functions, because there is no easy way to orchestrate all this crap, the data flows and everything.
01:07:41
Speaker
Yeah, exactly. So what kind of applications do you think are going to be... of course, I'm not going to say that this is all crap, you know, that it's not usable for anything. Right. But what kind of mental shift do we need to use platforms like Zeit? I don't know how you pronounce that. Yeah, I don't think I can, or I'm most definitely certain that I can't, predict the future.
01:08:08
Speaker
Yeah, no, damn it, you know, this is episode number 42, come on, give us the answers! But I think we've seen a lot of paradigm shifts and a lot of shifts in mentality over the... I mean, that's what computer science, and especially computer engineering, has been
01:08:32
Speaker
since its inception: thinking of new ways to architect the applications that we run. And it may be a little rough right now to think entirely in lambdas, but imagine a scenario where you just... you know, okay, just put a new layer of indirection in it. Yeah, of course. That solves every problem. As one does every time. Yeah.
01:09:00
Speaker
And that layer of indirection that I'm thinking about is: imagine that you're somehow writing your program in the traditional, familiar way that you're used to, and then you have this build process where, if the way you architected your application is a directed acyclic graph, for example,
01:09:21
Speaker
It takes that, and it knows all the dependencies of everything in your program, and it can generate, say, a thousand lambdas for the thing that you wrote in the traditional way, right? And I can certainly imagine that someone who solves that problem would be instantly rich.
01:09:42
Speaker
That would be rich. Datomic Ions. Anyway. I've been into lambdas since a couple of years ago, and I'm a bit bearish on them at the moment, to be totally honest. The reason I'm a bit bearish on them is because I feel personally that they're a bit like an enterprise service bus,
01:10:06
Speaker
in the sense that there is a kind of magic hand in the back of all these lambdas that is generating the events and doing the invocations. And so I don't like that. I don't like this sort of magic hand operating in the system. So this is why I'm kind of down on it. I think the concept of
01:10:28
Speaker
of highly parallelized and distributed programming is totally awesome. But the Lambdas seem to me like they're not right. So, you know. I see. Yeah, I mean, I'm also, everyone's entitled to their own opinion. I'm not saying I disagree or I agree. I actually
01:10:52
Speaker
This is actually the first time that I've been messing with lambdas, so I really can't be considered an expert in the matter. So I'm going to leave your opinion as you laid it out, because I really don't have a comeback for that. I like the unikernel concept, though, because unikernels are independent and they're, you know...
01:11:20
Speaker
Yeah. The way I've been thinking about Lambdas so far, and as I've mentioned, I have zero to very little experience with them, is only in the context where a Lambda is attached to what AWS calls the API Gateway, which is how they're invoked. Yes.
01:11:43
Speaker
Or the event that they receive is effectively an HTTP request. Yeah. And that is the only context to which I've been exposed, so I can't really say anything else. In that situation, which I really think is a very, very useful situation, any kind of network request... you want to respond to that network request. That kind of mimics the use case of unikernels, and that's the way I've been thinking about them.
01:12:08
Speaker
Yeah, but the nice thing about the unikernel HTTP request is you basically get, like you said, a full-stack response that's just impromptu, ad hoc generated. You haven't got all these fibers to go through and all these orchestration layers and all this other shit to go through. You just deploy your stuff somewhere, you make it essentially listen on an endpoint, and then the operating system just starts it
01:12:34
Speaker
as needed. And that's the only glue that you need, you know: essentially something listening on a network socket that can call into your system. I like the bare-bones aspect of that.
01:12:47
Speaker
Yeah, maybe lambdas are a means to an end, in a way that they're something that needs iteration, and this isn't their final form. Who knows? Maybe in the future we'll have something like what you're describing, that is an iteration on what lambdas are now.
01:13:10
Speaker
Yeah, I mean, you know, I listen to the Mirage team, and that's what their vision kind of is. Yeah, that's definitely a very, very cool project. Yeah. Cool. So, before we... I think it's almost, yeah, almost one hour. It's past it now. Pretty cool.

Community Expectations and Technology Reflections

01:13:29
Speaker
Okay.
01:13:32
Speaker
I think we talked a lot about Clojure and then OCaml. What do you miss, because you moved from Clojure to OCaml? What do I miss? 100% Datomic. Awesome. I gave a talk at ClojuTRE.
01:13:54
Speaker
Yeah, about OCaml, right? Yeah, in September. Yeah, I saw that. Small FP. Yes, exactly. Not at ClojuTRE, at Small FP Conf, which is the day before. Yes. And what I talked about was finding in OCaml
01:14:11
Speaker
what I was used to in Clojure. And I think the way I ended the talk is: I really haven't found any suitable alternative for Datomic yet, even though GraphQL on top of a database gives you
01:14:27
Speaker
kind of, sort of, some part of the same experience. Yeah. Even though you really don't have the locality of... well, I mean, also in terms of data, but locality in terms of the database being a part of your application, the feeling that Datomic gives you.
01:14:48
Speaker
Yeah, and I really wish that someone writes... you know, I've thought about it. Yeah, exactly, probably someone else, because I think it's a hard undertaking... that someone writes a Datomic client implementation in OCaml, or even in C or C++, because you can bind to that, that you can use.
01:15:13
Speaker
I think there was some effort by Mozilla people to replicate, or at least, you know, make Datomic using Rust, but it kind of died, I think. What was that project's name? Mentat, maybe? Yeah, something like that. Yeah, I think it died. Yeah, it disappeared pretty quickly, I think. Yeah, I think they killed it, but I'm not sure.
01:15:42
Speaker
Any final thoughts before we wrap up? I have one final thought. Since everyone seems to be entitled to their own opinion, I think I'm also entitled to mine. I think we've seen a lot of
01:16:05
Speaker
a lot of discussion recently about how the Clojure core team puts out features and how the community responds to that. And well, since I'm given the platform, I want to give my opinion on that. And my opinion on that is that
01:16:28
Speaker
I feel like there's a lot of disconnect in expectations. And so I would attribute the recent communication, and let's call it maybe flame wars, maybe not that impactful...
01:16:47
Speaker
The recent anger from the community in relation to that is, in my opinion, attributable to a difference in expectations, in that people come to Clojure expecting it to be a community-run project, run in an open manner, and they need to
01:17:15
Speaker
accept that it's okay that Clojure is a project that is run inside Cognitect, tested at Cognitect, without any... okay, that's also not fair, but with little input from the community. And that needs to be okay. And if they like Clojure, they should keep using it with that in mind. I felt like I needed to say that for a while.
01:17:46
Speaker
And so I would say this is
01:17:51
Speaker
This is defending Cognitect and the team that makes Clojure. But I also have the opinion that what I just said needs to be clearly communicated by the people that make Clojure and think about Clojure, so that people don't come to the language with the expectation that they can make
01:18:18
Speaker
any impactful difference in the shape of how the language is built and iterated on.
01:18:29
Speaker
I think you put it very nicely, that it's the mismatched expectations mostly. And the trouble is that the people who come, or the people who have been in the community for a long time already, they feel responsible for the language, because they're invested in it. I think that's where the fundamental friction is coming from.
01:18:57
Speaker
Yeah, and part of that also is, and I do not include myself in that group of people because I only came to the language later, but people that have been with Clojure since its beginning: they used the language at a point where it was pre-1.0 and Rich was actively soliciting feedback from the community. And then there was a point where that stopped happening. And so I think some people
01:19:28
Speaker
still have not come to terms with the way that the project is run now. That needs to happen, and that's going to happen sooner or later. Yeah.
01:19:39
Speaker
I think I understand that most of the toxicity is on Twitter and everything. I don't read Twitter shit, so I don't care. I only show up there to retweet some things or talk about some things that I'm doing. It's like the worst platform ever to have any kind of reasonable discussion.
01:19:59
Speaker
Twitter is a terrible place to have opinions. Exactly. Because Twitter inherently is a platform that makes you communicate in a concise manner. And speech that is very concise is often interpreted in many ways in which it wasn't initially intended to be.
01:20:27
Speaker
You know, a random fact of the day: I don't recall where I read this, and please feel free to fact-check this. We've got a team on it now.
01:20:45
Speaker
I read recently that one of the consequences of doubling the Twitter character limit was that people started saying please and thank you more. That's nice. Because in 140 characters you simply didn't have the space to say please or thank you, so you just demanded stuff.
01:21:09
Speaker
Yeah, I always thought that "fuck you" was shorter than "please" or "thank you". So, you know, I think it works better. I think it's also, you know, the trouble here is that every now and then, because I follow maybe Stuart Halloway and Alex and a couple of other people,
01:21:25
Speaker
and if I see something from them, then I have to understand the context, like, what the fuck are these people talking about? I don't know, you know, there is something happening somewhere. But the interesting part is that nobody's going to tweet like, oh my god, today I made an amazing program. Nobody's gonna do that.
01:21:43
Speaker
Yeah, I would also encourage people, again, since I'm given the platform, I would also encourage people to say good things about stuff more than they say bad things about stuff. Exactly, because there is enough bile out there. I mean, we have enough shit out there. We get it. Yeah, exactly. The world is a terrible place. We get it.
01:22:11
Speaker
Because there is nobody saying, oh, today I deployed my project within half an hour without any problems, or something. The terrible situation is that, I think it's built into our evolution or something, to warn each other about impending doom rather than talking about, oh, there is a nice apple tree somewhere. No, I'm not going to tell other people, because I need all my apples.
01:22:35
Speaker
But there is a fucking predator there. I'm going to shout to everybody. That's the tricky part, I think. But I'm pretty sure there is a significant amount of people who are actually enjoying and having fun with the language, and are okay with the way it is. Don't get me wrong. Of course, yeah. Clojure is a wonderful language that serves a wide variety of use cases. And if people are
01:23:02
Speaker
comfortable with the trade-offs and the design decisions that have gone into Clojure,
01:23:09
Speaker
and they're comfortable with having... Because it has to be okay that some things are made by one company for everyone else. And the input that they request is not as big as in other communities. And so if people are comfortable with that, they need to match their expectations
01:23:34
Speaker
to what the reality is, and keep using it in the way that they want. Yeah, I think it's interesting, isn't it, to compare it to other companies which do that. So an interesting example to me is comparing Clojure and F-sharp and Swift, let's say, because, you know, Microsoft are, you know,
01:24:02
Speaker
they make their things for themselves, and Apple make their things for themselves, and, you know, Cognitect make their things for themselves in the end. And other people benefit from it, and that's fine. I think the question is, like you say, that Swift and F-sharp and these other languages, I think, are more clear about what the basis of their change process is. And Clojure is not so clear, to be honest. I mean, that's what I would say: the expectations
01:24:31
Speaker
have not been set properly. They need to be set properly and clearly communicated. Because I do agree with you. Or maybe not. Or maybe not, right? But if they are, that will probably lower the volume of complaints, in the manner that they have been happening in the past few months, I think.
01:24:54
Speaker
Well, I think people like clarity, don't they? People like to understand what the situation is, because like you said, expectations exist, whether they're realistic or not, you know. And I think in the end, if you're producing something and you do expect other people to use it, then you should be clear about what the limit of liability is, you know.
01:25:23
Speaker
Because, you know, if I produce a car and people drive it, but then the wheels fall off, then people are like, yeah, but... and if you say, whoa, man, look, read the fine print, you know, wheels could fall off at any moment. Oh, shit. Yeah. Yeah. There needs to be some kind of fine print here. Yeah. In my opinion. Yeah. But I agree with you. I mean, conflict is mostly about expectation mismatch. So I think you've nailed that one. I think that's 100%. You know, we can all agree on that.
01:25:54
Speaker
Perfect. So, I think... Comity in episode 42. Exactly, finally. We come to a stasis, you know, like a nice equilibrium. So, thanks a lot, Antonio, for joining us. Thank you for having me. It's been a pleasure to talk to you, and hopefully somebody will pick up Lumo. It's not going to be me, I hope not. That's also fair.
01:26:23
Speaker
I don't think I'm smart enough to pick up that level of startup. I don't think it's about the smarts, really. Then I have to poke you in Portugal and then learn a couple of things from there. But it's a very mature thing to do.
01:26:39
Speaker
You build a project and you wish it well, and then you reach out to give it to, I don't know, the next, whatever we call it, succession. Next in line, whoever that may be. Exactly. So hopefully you will ride the camel long enough and then get back into Clojure, or maybe share your thoughts. And good luck with all the stuff.
01:27:07
Speaker
And just before we close, I have a small announcement about Dutch Clojure Day. This is like the fourth edition of it, and we do have a lot of fun there. I'm a co-organizer for this one. Next year, it is going to happen on the 6th of April, on a Saturday, in the center of Amsterdam. And this is the nice thing about the community: we have 150 spots available. It's a free event, so anybody can attend.
01:27:36
Speaker
And we have around 60 tickets gone already. We didn't even announce anything yet; there's no schedule or an agenda. It's being announced right now. Run to get yours. Yes, get your free spot. And the call for proposals is open, so please submit your talk, whether it is a beginner talk, or
01:27:57
Speaker
if you want to take over Lumo and talk about Lumo, please go ahead. And check out our website, clojuredays.org. This is a free conference, so we are also looking for sponsors. It's going to be a very affordable conference. We have two types of sponsorship: one at 1500 euro and the other at 5000 euro.
01:28:23
Speaker
So we call them partner and sponsor. So if you know somebody who knows somebody, or if you are interested in sponsoring, get in touch with the Clojure Days team. And I hope I'll see you there. And the whole defn gang will be there, with Ray and Vijay. So that's it from me. And again, thanks a lot, Antonio.
01:28:45
Speaker
Yeah, thank you, Antonio. Stay on the line; don't jump off, even if we close the channel. Yeah, it was really great being here. And I've actually been trying to make it to Dutch Clojure Day for, I think, two years, but it never happened. So I hope I'll be there this time. Yeah, no, no. Yeah, third time's the charm. Yes. Thanks a lot. That's it from us. And I think, goodbye. All right.
01:29:46
Speaker
Something that didn't come up during the podcast, though, is that I'm an Emacs user. So, yeah, take it, Ray.