
Testing TVs At Scale With Elixir (with Dave Lucia)

Developer Voices

Integration testing is always a tricky thing, fraught with problems setting up the right environment and attempting to control the system’s state. That’s particularly true when you’re dealing with a mix of software and hardware, and even worse when you don’t have control of what the hardware can do.

This week I’m joined by Dave Lucia of TVLabs, who’s building systems for testing television software at scale, and it’s a problem that needs a huge variety of techniques to crack it. He’s using cameras, real-time video processing, Erlang & Elixir and a host of other tools to make it possible to test a fleet of televisions on demand.

Sometimes good systems revolve around a single big idea; this time it’s a large combination of solutions, coordinated by the BEAM, that gets the job done.

--

TVLabs: https://tvlabs.ai/

Flipper Zero: https://flipperzero.one

ATSC 3.0 “NextGen TV”: https://en.wikipedia.org/wiki/ATSC_3.0

Support Developer Voices on Patreon: https://patreon.com/DeveloperVoices

Support Developer Voices on YouTube: https://www.youtube.com/@developervoices/join

Kris on Bluesky: https://bsky.app/profile/krisajenkins.bsky.social

Kris on Mastodon: http://mastodon.social/@krisajenkins

Kris on LinkedIn: https://www.linkedin.com/in/krisjenkins/

Transcript

Modern TV Reliability Issues

00:00:00
Speaker
A few weeks ago, my television crashed. It just completely froze up. And the only thing I could do was to physically unplug it and plug it back into the wall. And in that moment of frustration, I shook my fist at the sky and said, why are televisions like that these days? TVs never used to crash. They used to be completely reliable.
00:00:21
Speaker
But in fairness, they also used to just show half a dozen channels and expect you to show up when they were ready to broadcast. Yeah, old televisions were rubbish. We're far better off in this modern world where you can watch whatever you want, whenever you want it, because your TV is now basically a computer. Even though they're less reliable because your TV is now basically a computer.

Challenges in Testing TV Software

00:00:47
Speaker
Computers bring possibilities and problems. We know this.
00:00:51
Speaker
And it's the problems we're going to look at this week. Because how do you test television software? I mean, if you thought making things work across different browsers, or across different smartphones, got complicated, imagine testing across a field of 50-inch plasma TVs.

Introduction of Dave Lucia

00:01:10
Speaker
The solution to that problem is actually one of the most interesting practical engineering stories I've heard in a long time. This week I'm joined by Dave Lucia, and he works on a television software testing fleet that needs real-time video processing, Gen AI, WebRTC, some Lua scripting, and a large fault-tolerant distributed Erlang system for scheduling, and a few more things besides.
00:01:38
Speaker
Sometimes solving a problem needs one big idea. This is the story of many interesting interlocking pieces that fit together, all in the hope that in future, my television won't crash. I'm your host, Kris Jenkins, this is Developer Voices, and today's voice is Dave Lucia.
00:02:10
Speaker
Joining me today is Dave Lucia. Dave, how are you? I'm doing great. Thank you for having me on, Kris. A pleasure. You're just off the back of organizing an Erlang conference, is that right?
00:02:22
Speaker
That's right. Not just Erlang, but Erlang, Elixir, and Gleam. It was called Code BEAM Lite NYC. We had it in Brooklyn, New York this past Friday. And it was wonderful. It was 10 talks, half an hour each. We had the creator of Sonic Pi, Sam Aaron, on to give our keynote. It was really a wonderful day.
00:02:43
Speaker
That sounds really cool. Yeah, I'm looking for an excuse to get to New York sometime. So next year, if you need to give someone a press pass, you let me know. Well, this is wonderful marketing, because now we're going to have to have you on next December. I look forward to it.
00:03:00
Speaker
OK, enough of me pitching for free conference

Dave's Career Path to TV Software Development

00:03:04
Speaker
passes. I actually heard about you because I was looking at a similar conference over in Berlin, and you were talking about Lua on Erlang. And I thought, well, how did you get into that? And it turns out the answer is absolutely fascinating: through television software. Tell me the story of how you got into TV software development.
00:03:29
Speaker
Wow. Okay. Well, it's a little bit of a convoluted path, and it involves, I guess, a decade of building relationships in the technology world. But I started out my career at Bloomberg.
00:03:43
Speaker
And Bloomberg is a financial services company. You might be familiar with the Bloomberg terminal, or maybe Bloomberg News. I started out on the Bloomberg terminal building trading software. And about halfway through my five-year stint at Bloomberg, I moved over to Bloomberg.com.
00:04:02
Speaker
And back in 2014, we rebuilt Bloomberg Businessweek and Bloomberg Politics. And it was just a really fun time. This was right when React was starting to get popular. We built an open-source framework that no one else used, called Bloomberg Brisket, which was one of the first single-page app frameworks in JavaScript.
00:04:24
Speaker
Sorry, this is convoluted, but I promise we'll get there. I believe. I met someone by the name of Stephen Baldwin, who was consulting for Bloomberg. And we stayed friends throughout the years. I had since left Bloomberg to do a bunch of media startups. I worked in sports betting for a while. And after my most recent startup, I was looking to do something new, and reconnected with my friend Stephen.
00:04:56
Speaker
And in the eight-odd years that we had not been working together, he had started working at his consultancy with a number of different media companies who built smart TV apps.

Creation of TV Labs

00:05:10
Speaker
And while they were building these smart TV apps, the pandemic hit. And they were having this challenge with their clients of: I'm building an app, I need to test it on an LG television. Sometimes the simulators are not so good, and so we need to make adjustments based on how it works on the physical hardware.
00:05:32
Speaker
And so what they started doing is they were taking their office space and sticking a bunch of TVs in it, building a literal TV lab. They'd point a webcam at it, and someone would go in, sideload a build onto the TV, and go ahead and test it. And after a certain point in time, they were running into all these challenges with making this work, and realized: hmm, this could really help our clients. It seems like there's a product here.
00:06:02
Speaker
And so ultimately, the stars aligned where my friend Stephen was looking to take this idea and build it as its own company, as a product, and I was leaving a startup that I'd been at for maybe about a year and looking for something new. And so we joined forces to build what we were calling TV Labs.
00:06:24
Speaker
Okay. I remember going to Google's offices years back, and this was right in the early boom of the smartphone thing, and they had walls of smartphones, like every model in a kind of library, and you could go and pick one off the shelf. I can't quite imagine that working with a 48-inch plasma television.
00:06:47
Speaker
That's right. Is that what we're talking about? Like, just having to have access to every kind of television? Pretty much, yeah. So if you're a big enough media company, you probably have the budget to have a warehouse somewhere that's just filled with televisions, a TV lab that exists exactly for this purpose. You have a QA team.
00:07:10
Speaker
And before a release goes out, they are certifying the release internally by physically going to televisions, uploading a build, using the remote control to test the television, and making sure that everything that was built and is working correctly "on my machine" also works on the physical LG TVs, Panasonics, Samsung's Tizen platform.
00:07:35
Speaker
But not just that, we've got Apple TVs, we have Rokus, we've got... Of course, yeah, the Fire Sticks and all the... Fire Sticks. There's so much fragmentation, Kris. Yeah, this instantly sounds worse than the multiple-browser problem of the early 2000s. That's right. Okay, so before we get into what sounds like either a nightmare or a fantastic source of QA employment,
00:08:03
Speaker
just if we can step back, because I don't know much about development for televisions. I'd always kind of assumed they just wrapped a crummy browser, but give me a bit of an insight into the world of a television app developer. What are they working with?

Unique TV Platform APIs and Web Technologies

00:08:21
Speaker
So, luckily, most of it is web technology, and most of it is wrapping Chromium and putting it into a television. The problem is: which crappy browser are they putting into the television? That is the question. I wish I had more depth of knowledge on the specifics of every platform. What I can tell you is that most of these platforms are wrapping something like Chromium or SpiderMonkey. They're shipping a set of their own SDKs onto the device. But then each device is going to have its own APIs for uploading builds, getting different hardware controls on the television, permissions access. Oh. Because it's not just that, like,
00:09:10
Speaker
every television has a unique way of talking to it. And yeah, OK, sorry. Carry on. That's right. No, that's right. Every TV has its own unique way of talking to it. Some televisions run the Android stack, so there are Android TVs. There are also Google TVs. There's also the Chromecast. There's also a new platform that I believe might have already come out, or is coming out early next year. I forget.
00:09:38
Speaker
That's built on top of Android, which is the whole Java stack and all that fun stuff. And that, presumably, is a flavor of iOS. Oh, yes. They've got tvOS. Then we've got the Tizen platform, which is built in Java. The apps, I think, are also web apps. I'm blanking on that right now. But the thing I think you can take away here is that every single manufacturer, for the most part, unless we're dealing with the Android-based televisions, and that's a decent part of the market, is a different fundamental platform to work with and ship on.
00:10:14
Speaker
And so you can imagine that there are frameworks out there that try to standardize how you might interact with each television, but the devil is in the details. And so as you start to build out features and have maybe more complex interactions, or even just simple things like your use of Flexbox, it might behave in surprising ways, because it's this old version of SpiderMonkey running on this one particular 2017 Samsung Tizen platform, and that has 8% of your market share, and that's hurting your bottom-line revenue. These problems actually matter.
00:10:53
Speaker
And so what's interesting, I think, for us is that it's not always necessarily the newer televisions that are the problem, because the first thing you think is: oh, you must be having to keep up with all these new televisions and new devices every day. No, the problems that we hear from our clients the most are: can you source this very specific 2017 Panasonic television for us that's based out of Asia, or based out of South America? Or: this one feature only works in the UK, and we want to test that; can you make sure that we have a UK television that's physically located somewhere in the UK? These are the types of problems. Yeah. And, I mean, we're kind of getting ahead of ourselves, but do you end up scrolling through eBay looking for a particular model of television?
00:11:38
Speaker
You can imagine there's a multitude of places where we might be procuring devices. Okay. Sometimes, if we're trying to get something really fast, it's friends and family: hey, does anyone have this in their basement that we can borrow for a few days? Yeah, Craigslist and that kind of thing. That's right. Okay, so let me see. One of the questions before we get into your solution is: how large is this market? Because I'm instantly thinking, okay, Apple has to have an app for every kind of television. Netflix has to. But how many players are there? There aren't that many television channels that have an app for every television out there. How large is your potential customer base here?
00:12:25
Speaker
I mean, it's pretty large when you consider it globally. So I think in the tech world, in the US, we tend to be very US-centric. But globally, everyone is consuming media on a television, whether it's television via broadcast or via streaming; there are many different ways to consume media on a television.
00:12:48
Speaker
The market is quite large, and there are some really big players: the Netflixes, the Amazons, the NBCs, the Warner Brothers of the world. There are medium players: in Europe, in the UK specifically, you have Channel 4 and ITV and all of those. But then there's South America, which has a very different market. So globally, we have this massive market, and each player has different constraints and needs and hardware and all these different challenges. And not all of them will have the budget of Apple or Netflix. That's right. That makes sense. Okay, so there I am, a medium-sized television executive
00:13:34
Speaker
who doesn't have the budget to burn, or even perhaps the physical space to have a warehouse full of televisions to test on, and certainly doesn't want to hire QA engineers who are sitting there wandering around with a USB stick and a controller. You've been trying to automate this process. Take me through how you possibly automate this problem away. OK. Well,

Video Output Challenges in TV Testing

00:13:59
Speaker
there are many layers to go through here.
00:14:02
Speaker
It might make sense actually to go bottom-up rather than top-down, if that's OK. OK, yeah. Sure, we can build it up the stack. So let's start at the very bottom, which is: I have a television, and this television has a screen. It's got HDMI inputs for plugging in your PS5, or maybe your Fire TV, Apple TV, whatever it is. It doesn't have an HDMI out, right? So getting video, high-quality video, out of the television is not actually the easiest thing.
00:14:38
Speaker
Yes. Okay, we're right at the very first hurdle, and that sounds horrible. What do you do? Do you crack them open and try to solder in an HDMI out board? Well, I think you probably could. You probably could find a way to do a hardware hack and fiddle some wires. I'm imagining kind of like an action movie where you're trying to defuse a bomb. Tom Cruise could do it in a reasonable timeframe. We tried to hire Tom Cruise to hack our hardware, and it just wasn't economically viable, let's say that. Fair enough. Or maybe you've got a little Flipper Zero here. I don't know.
00:15:21
Speaker
Anyway.
00:15:24
Speaker
The solution that we came up with is twofold. One, there are certain devices where we do have HDMI out. And for that, with an HDMI capture card, you can easily capture the stream in HD. You need to have a particular license; not just anyone can be capturing HDMI out, because there's DRM on the devices that prevents you from capturing it in high quality.
00:15:47
Speaker
So you have to have a license to be able to do this. And for devices like set-top boxes, I'm talking about the Fire TVs and Apple sticks of the world (I feel like I'm repeating myself a lot; we're going to be hearing about set-top boxes for the next hour), capture cards are the way that we get the video out. But for televisions, it's a bit of a bigger challenge, because with televisions we don't have that luxury. And so for that, we use cameras.
00:16:18
Speaker
Now, this poses its own challenge in and of itself: how do you take the screen, which is a nice 1600-by-1280 rectangle, and flatten it? Because a camera's lens is always going to be slightly curved, even the flattest camera. So you're going to have radial distortion around the camera, right? Yeah. You're also going to have all that really white light from the TV saturating all of the color and distorting the image. And so if you naively did what my co-founders did before we formed TV Labs, you're pointing a webcam at a TV, and you've got this horrible, very saturated image that doesn't look very nice.
00:17:08
Speaker
Not quite a rectangle around the TV, but kind of this weird distorted rectangle. And so what we did is we found cameras that were going to be really good for the type of light and image that we're trying to capture. And our goal always was: you should be able to stream a 4K movie and have it show up in your browser as if you didn't know the difference. It was just shipping your television screen right into your browser. So we're taking cameras, we're pointing them at televisions, and we're then taking that video stream. And we actually have four cameras on our boxes, so we're stitching them together. We're applying different undistortion techniques to flatten out the image and stitch it together into that rectangle. Why four cameras? Is this a resolution issue?
00:18:01
Speaker
It's not a resolution issue. We could do it with one. And I think this is an early optimization that we made that maybe we would change now. But our goal was always to have a box, a very tiny box, that we could stack many, many, many of next to each other and on top of each other, so that we can have a warehouse full of hundreds or thousands of televisions. In order to do that, there's a certain point where space becomes a limiting factor for how many you can have in your warehouse, unless you have a lot of vertical space, and that's a challenge in and of itself. So what we wanted was to get the cameras nice and close to the television, so the depth of our boxes could be very narrow. Oh, right. Yeah. So you've got to have a camera a certain distance away from the television to be able to see the whole thing, like a parent shouting at their child to sit back from the TV. That's right. And so you've got four up close. Yeah. OK, so now I see why you've got a stitching-video-streams-back-together problem.
00:19:04
Speaker
Right. And then you have four video streams, but they're overlapping. They're going beyond the edges of the television screen. And so we apply different computer vision techniques to get that back to that nice rectangle. We apply an undistortion to flatten it, and color

Complexity of Automating TV Testing

00:19:27
Speaker
correct it. And then what ends up happening is that we have this video stream, which we can then stream to your browser over WebRTC, that's nice and flat, and you couldn't tell the difference that there were four cameras. There's no seam. It looks perfect.
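For the curious, lens undistortion of the kind Dave describes is commonly modeled with the Brown–Conrady radial model, where a point at radius r from the optical center is scaled by a polynomial in r². This is a minimal plain-Python sketch of the idea, with made-up coefficients; the real pipeline would use calibrated per-camera parameters and GPU-accelerated libraries:

```python
def distort(x, y, k1, k2):
    """Apply Brown-Conrady radial distortion to a normalized
    image point (x, y), using two radial coefficients."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the distortion by fixed-point iteration: estimate the
    scale at the current guess and divide it out of the observed point."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y
```

Round-tripping a point through `distort` and then `undistort` should recover it to within floating-point tolerance for mild distortion, which is a handy sanity check when tuning coefficients.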
00:19:42
Speaker
Do you have the luxury of that being repeatable? Like, can you set those settings once for the four cameras and expect it to work with every television that you've got a box for thereafter? Or do you have to hand-tweak each quartet?
00:19:59
Speaker
So there was some amount of hand-tweaking that we did early on. Now we actually have a pretty nice process where each one of our cameras goes through a tuning process before it even goes into a box, and then it's all calibrated nicely. And then when it's put into the box, it's great. But even with a calibrated camera, it's not going to be pointed perfectly head-on to the television. There's always going to be a slight angle, and the TV might not be on the mount perfectly. So we have this calibration process where we click a button in our platform. It shows a special image that it uses to calculate what's called a homography with the television.
00:20:45
Speaker
It finds that perfect calibration, and then it uses the math it derives from the calibration image to find that perfect point of reference. Yeah. Yeah. So it's basically automating the test card of old. That's right, that's right. Okay. I'm already getting a sense that we're going to be solving so many micro-problems along the journey to getting this whole thing working.
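A homography is a 3×3 projective transform with eight degrees of freedom, so four corner correspondences pin it down exactly. Here's a rough pure-Python sketch of solving one by direct linear transform and using it to map camera pixels onto the ideal screen rectangle; a production system would more likely use OpenCV with many more detected calibration points, and the corner coordinates below are invented:

```python
def solve_homography(src, dst):
    """Solve the 3x3 homography H mapping src[i] -> dst[i] for four
    point pairs, fixing H[2][2] = 1 (standard DLT, 8 unknowns)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = gauss_solve(A, b)
    return [[h[0], h[1], h[2]], [h[3], h[4], h[5]], [h[6], h[7], 1.0]]

def gauss_solve(A, b):
    """Plain Gaussian elimination with partial pivoting (8x8 here)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def apply_homography(H, x, y):
    """Map a point through H with the projective divide."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

With the homography in hand, every camera pixel inside the detected screen quadrilateral can be remapped into the flat 1920×1080 rectangle, which is the "find that perfect point of reference" step.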
00:21:13
Speaker
That is basically my life, yes. So you asked the question we're on a journey to get to, which is: how do you automate testing televisions? It's like building an apple pie from scratch: we first must invent the universe. We first must invent every single possible way to automate interacting with the television. Otherwise, it will consistently break in new and spectacular ways every day. Yeah. I guess that's a competitive advantage as well, right? In the long run. That's right. Okay, so you've reached the point where you can stream WebRTC out of a television effectively.
00:21:54
Speaker
But Kris, I think you'd agree that it's pretty boring to stream a television that's not on; you'd probably want to be able to turn it on too. OK, yeah. So OK, we have the problem of input now, which is: we've got these cameras pointing at a television in a dark box. And in order to do something useful with it (we're not even getting to the problem of uploading your build to the television; we'll get there later),
00:22:24
Speaker
how do I turn the thing on? How do I navigate to a channel? How do I change the settings? How do I do all these things? So we have an IR diode in there. We've got something called CEC, which allows you to interact over HDMI. For each individual platform, we've got web-based and gRPC-based protocols for sending messages back and forth to the television. We've got all of these different ways to interact with the television that we have to solve through hardware, and through different types of APIs right on the box.
00:23:01
Speaker
So I'm imagining... You said the two main ways you can control a television are CEC, you called it. That's the one over HDMI, right? That's right. That, I would guess, is fairly standardized. Infrared: are you having to manually train the specific remote control? Is someone sitting there when they buy a new television going, okay, this is the up button, record that?
00:23:31
Speaker
Well, I showed you my Flipper Zero for a reason, which is that it's pretty useful for intercepting IR commands and reverse-engineering how they work. Now, some of them are published, right? But every single television is going to have slightly different IR protocols for how you might send commands. So there's the basic stuff, like turn on, channel down, channel up. There are more complicated things, like: I've held down the volume-down button, and that's now a long press. Each television manufacturer might have slightly different ways of implementing that IR. Yeah, that doesn't surprise me.
00:24:10
Speaker
So we have a PCB, a printed circuit board, which has an IR... actually multiple IR diodes connected to it. And we'll send those IR signals over the IR diode, and that will communicate with the television just as your very basic remote would. You said that CEC must be a very ubiquitous standard. And you can imagine CEC is like...
00:24:36
Speaker
again, set-top boxes, we're going to keep saying it: Fire TV. You turn on the Fire TV, and that turns on your television. The way that works is that the Fire TV is sending a CEC standby command to the television saying, hey, turn on. The unfortunate thing is that,
00:24:55
Speaker
while they are standard, quote unquote, they all behave slightly differently with each manufacturer, and not everything always works as expected. So we have many different things on our platform that are really tuned for the particular platform to work most reliably. And this goes for communication in general. We have to make sure that, on any one platform, not only can we provide all the capabilities, IR, CEC, whatever, but we can also choose. In certain situations, you might want to test IR, but you also might want to test the Roku-specific API, for example. So we have to provide all of them. And we'll usually choose the most reliable one when you're interacting through our web platform.
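As a concrete taste of what "slightly different IR protocols" means at the bit level: one of the most common schemes is the NEC protocol, where a remote sends an 8-bit address and an 8-bit command, each followed by its bitwise complement for error detection, as a 32-bit frame transmitted least-significant bit first. A small sketch (the address and command values are purely illustrative; actual codes vary per manufacturer, which is exactly what the Flipper Zero helps reverse-engineer):

```python
def nec_frame(address: int, command: int) -> int:
    """Pack an NEC IR frame: address, ~address, command, ~command,
    one byte each, with the address in the least-significant byte."""
    inv_addr = ~address & 0xFF
    inv_cmd = ~command & 0xFF
    return address | (inv_addr << 8) | (command << 16) | (inv_cmd << 24)

def wire_bits(frame: int) -> list:
    """NEC sends bits LSB-first; return the 32 bits in transmit order."""
    return [(frame >> i) & 1 for i in range(32)]

def is_valid(frame: int) -> bool:
    """Check the complement bytes, the protocol's built-in error check."""
    b = [(frame >> (8 * i)) & 0xFF for i in range(4)]
    return b[1] == (~b[0] & 0xFF) and b[3] == (~b[2] & 0xFF)
```

For example, `nec_frame(0x04, 0x08)` packs to `0xF708FB04`. A real transmitter would then turn each bit into timed bursts of a 38 kHz carrier on the IR diode, and this is only one of several protocols in the wild (RC-5, Sony SIRC, and various vendor extensions all differ).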
00:25:45
Speaker
Right. I feel like this is going to be a disappointing answer, but I have to ask the question: at any point do we get a robot arm pushing the button on the television? So, our first prototype was a severed hand that would poke around at the television. Like Thing from the Addams Family. That's right. Again, just like Tom Cruise, we didn't think that was going to scale, so we had to decommission that particular path.
00:26:11
Speaker
I was hoping that in just a few cases you'd have something like... Okay, so that makes me think of one other big input to the television that you haven't mentioned, which is power. Do you have some kind of power control so you can hard-reboot the television when it crashes? We do. We have, I think, five different power controls for... Five? Okay, talk me through that.
00:26:40
Speaker
So, okay. Why is one not good enough? Man. OK, I come from a background of Erlang and Elixir and the BEAM, and the philosophy is: let it crash. The problem is that when you're managing hardware, things don't always crash in a clean way. They crash in dirty and weird ways, and they'll manifest as: things will crash-loop and generate a terabyte and a half of logs,
00:27:14
Speaker
or your cameras will just die and stop working. And so there are all these different little mechanisms that we have that are monitoring the state of our system and trying to detect these degradations in quality. And depending on what fails, we might want to power-cycle the device at a different level. So the first level is the device itself: through the remote, you can turn it on and turn it off.
00:27:45
Speaker
The thing that you might know from owning a television is that turning it on and off might not actually turn it off. It might turn the screen off, but it might just go into standby, and then it's just ready to go back on. And so that standby is not always useful enough when you're in some weird state and you want a power cycle.
00:28:04
Speaker
So the next level, on the hardware side, is to power-cycle the actual power going directly to the television itself. And for that, we have a power relay in our boxes that is able to turn the device on and off. Kind of a sibling to that is the software that is running next to these devices. Sometimes that goes wonky, and we have Docker containers running, so we might recycle those Docker containers. That's another way of turning it off and on.
00:28:36
Speaker
We might need to turn the compute that's on the device itself on and off. And then, if all else fails, we have a relay going to the actual wall, and we'll turn the entire power going to the entire box off.
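The levels Dave lists here (remote standby, TV power relay, Docker restart, compute reboot, wall relay) form an escalation ladder: try the gentlest remedy first, re-check health, and only reach for a harsher one if the device is still wedged. A toy sketch of that control loop; the function and level names are invented for illustration, not TV Labs' actual API:

```python
def recover(health_check, levels):
    """Apply recovery actions from gentlest to harshest until the
    health check passes. Returns the name of the level that worked,
    or None if the device was already healthy or nothing helped."""
    if health_check():
        return None
    for name, action in levels:
        action()
        if health_check():
            return name
    return None

# Hypothetical wiring; the real system drives IR, relays, and Docker:
# levels = [("standby_toggle",    toggle_standby),
#           ("tv_power_relay",    cycle_tv_relay),
#           ("restart_container", restart_container),
#           ("reboot_compute",    reboot_compute),
#           ("wall_relay",        cycle_wall_relay)]
```

This mirrors the BEAM's supervision-tree idea of escalating restarts, just extended out of the software world and into relays and wall power.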
00:28:51
Speaker
Wait a few seconds, and back on again. So, you've mentioned it, but we haven't really looked at it: there is some kind of computer inside this box. So the cameras aren't trailing a wire out to some central server; there's local processing inside the box.
00:29:09
Speaker
That's right. So every single one of our devices has basically a beefy, GPU-driven computer sitting on the device itself, running our specialized software and bundling up a bunch of vendor-specific software that allows us to communicate with the television. They're all connecting up to our web platform, which is saying: hey, someone wants to use an LG, and you've been selected because you're available. Start this session and initiate this WebRTC connection with this particular user.
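At its core, that scheduling step is matching a session request against a pool of connected devices and claiming one so that two users can't grab the same TV. A deliberately simplified sketch; in the real system this logic runs in Elixir on the web platform, and the field names here are invented:

```python
def claim_device(pool, brand, region=None):
    """Find the first idle device matching the requested brand (and
    optionally region), mark it busy, and return it; None if no match."""
    for device in pool:
        if device["busy"] or device["brand"] != brand:
            continue
        if region is not None and device["region"] != region:
            continue
        device["busy"] = True
        return device
    return None
```

In practice the claim has to be atomic across concurrent requests (a single Elixir process owning the pool gives you that for free), and the device is released back to the pool when the session's WebRTC connection ends.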
00:29:50
Speaker
So all of that is running Elixir on our devices, but we're doing a lot more than just taking frames off of cameras and audio and shipping it. We're also doing machine learning on every frame. We're also doing computer vision and audio processing, and recording the video and shipping it out to the cloud so that you can watch your sessions after you've interacted with them. So all of that is happening on this specialized piece of hardware that's connected to our printed circuit boards, which have all these different peripherals that we can control through our software.
00:30:30
Speaker
We also have our own power relay. We have a display on the device, driven by a little C++ program, that can tell you the temperature inside the box, and it can send an alert up to our web platform if it's crossing a certain threshold. There are all these little things that we need to be monitoring at all points in time to make sure that our boxes are healthy and safe and all that fun stuff.
00:30:59
Speaker
Do you also have, I mean, this is a slight aside, but do you also have cooling? And how does that interfere with capturing the audio? That's a very good question. We do have cooling. We have the ambient cooling of the room; we have air conditioning that's pumping in. But we also have, on either side of the box, two fans that are blowing air in and out of the box to provide circulation. So they're not just blowing air in; they're blowing air through the box, which provides enough cooling to keep the boxes at the right temperature.
00:31:38
Speaker
And you asked about audio. Yeah. You asked about audio.
00:31:45
Speaker
The noise of the fans isn't necessarily a problem, because we don't have a microphone in there. We don't have HDMI out, but we do have audio out, so we are able to capture that. Oh, do you always? Because I would have thought for a lot of televisions you do, and for some you don't.
00:32:03
Speaker
For every device that we're working with, I believe we do have the audio out. I guess I would have noticed if we didn't, because you'd hear quite a bit of noisy chatter of whatever any one of our clients is testing at any point in time. Yeah. Actually, that raises the thought that if you walked into that room at any point, it must be a cacophony of different television channels.
00:32:30
Speaker
Yes. It is, but you wouldn't see them, because they're contained in these boxes. There's the vent where the fan is, and you can actually see a little bit into the box, but for the most part it's very much enclosed. You'd be none the wiser, and it's actually pretty quiet. We do have two fans, but they're not making a ton of noise. It's a surprisingly quiet operation.
00:32:58
Speaker
Okay, so we've got these boxes. That was one thing I wanted to pick you up on: your central controlling software is Erlang, right? You're running the BEAM on the box. That's right. Why Docker as well? They seem to slightly overlap in what they do. They do. So we've tried many, many, many different approaches here. This could go into a whole side conversation. And I do need to remember that you asked me about how we automate working with these televisions, so we have to work our way back up to that. Yeah, we're getting there.
00:33:44
Speaker
When we first started out, we were basically just shipping code directly onto our compute and running it on there natively, which is very easy in a lot of ways.
00:33:59
Speaker
The problem that we ran into is that we're building on top of NVIDIA hardware. And the NVIDIA hardware poses a lot of challenges, especially when you're working with native libraries. So we're working with OpenCV. We're working with a technology called GStreamer for our multimedia pipeline. We're working with... what are some other native things we're doing? Kind of the machine learning stacks, a bunch of NVIDIA-specific GPU stuff, camera drivers, all of this. It becomes very difficult to ship your software to all these different places, have it run reliably, and make version upgrades and changes doing it natively on the box. I'm not saying it can't be done, but what we found is that Docker just became a lot simpler for us as a way to deploy new software. We have a build box that's running on an NVIDIA Jetson somewhere in our office. It's hooked up as a GitHub Actions CI runner.
00:35:08
Speaker
And we're able to build an image and push it to our local Docker registry. All of our devices, when they're done with a session, pull it down and power cycle themselves. And it just becomes a very simple way of deploying our software. The other challenge that we have is that we're running a bunch of third-party software, and not all of it is designed to run on ARM. So we have the problem of: how do you emulate certain software? So we've got QEMU running. I don't know how you say that correctly. No, I've always wondered. QEMU, that one, right? The Q emulator. QEMU, yeah.
00:35:48
Speaker
So someone will tell us in the comments how to pronounce it correctly. I hope so, because I've been wondering for some time now. So we have the problem of: how do you run this esoteric version of Java 7 or 8 that's way too old and has all these issues? How do you make that safe and run it on the box? If any of these fail, if anything segfaults, we need to restart them.
00:36:17
Speaker
So Docker is maybe bloated and way more than necessary, but it does give us this really nice way of releasing and packaging up our software, and it allows us to really rapidly release new versions of our software without having to think too much about the orchestration.
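The pull-based deployment loop described above (build on the Jetson CI runner, push to a local registry, devices pull and power cycle when idle) can be sketched roughly like this. The function names and the registry interaction are invented for illustration; the real agent is Elixir running in Docker, not this Python.

```python
# A rough sketch of the pull-based deployment loop: after each session, a
# device compares the image digest it is running against the registry,
# pulls if they differ, and power-cycles itself into the new container.
# All names here are hypothetical.

def maybe_update(running_digest, fetch_latest_digest, pull_image, power_cycle):
    """Poll the local registry once; update and restart if a new image exists."""
    latest = fetch_latest_digest()   # e.g. a HEAD request to the registry
    if latest != running_digest:
        pull_image(latest)           # e.g. docker pull registry.local/agent@<digest>
        power_cycle()                # reboot into the new container
        return latest
    return running_digest
```

Injecting the registry and power-cycle operations as callbacks keeps the decision logic testable without real hardware, which matters when the "restart" side effect reboots a physical box.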
00:36:38
Speaker
Right, right. So are you then not using the hot deployment features of the Erlang BEAM? You're just doing it through Docker? Okay.
00:36:50
Speaker
We're not. I mean, at least in our edge compute, let's call it, we're not really keeping much state that needs to be maintained between one session and the next. A lot of that is managed by our web platform, which keeps queues of: user A wants this particular device, and user B wants something slightly different, but there's one device that matches both of their criteria, so who gets it first? That kind of state is distributed across our web platform, and that's a whole different set of challenges. And maybe that would be a place to use the hot code reloading features of the BEAM. But we don't actually use that anywhere. We use other techniques to make sure that it's reliable and works well.
00:37:42
Speaker
Okay, in that case, it's time to zoom out of the box. You've now built a box in which you can shove a television; you can tell it things and you can see what it's showing back. And that box is now on the network. What do I do next to build out this service?
00:38:02
Speaker
Well, the first thing, before you automate televisions: to know what you're automating, it's often very helpful to be able to run through it yourself. So I can imagine I'm sitting here with my remote control. Look at this, I have one right here. Yeah. So I turn it on, I go to the page I want to navigate to, I go to the thing. And while I'm doing so in my platform, I'm taking screenshots of this.
00:38:29
Speaker
So we call this our access platform. And how this works is, in our system you say: give me any LG. And it will pair you with the first available LG. It doesn't care what year it is or whatever other constraints; you can make a very broad request. Just say, give me the first thing that's available, and I can go and test on it, take those screenshots. And now I have a good sense of the path I want to test. Then I go into our automation product.
00:38:58
Speaker
This is a very visual tool for interacting with the system. And we allow you to take those screenshots that you just took in your access session and mark them up. So imagine that you wanted to go to... let's make up an example here. There are some apps where you might have a channel guide. I'm thinking of one very particular app developer, but I don't want to use their name. So imagine you have a channel guide where there are all these different shows that you could watch in their app. And so you want to navigate over to that. You want to pick a particular show, you want to watch it, you then want to pause it and test that when you scrub back and forward, it actually moves you back through the content. And then when you hit play, it serves you an ad. That seems like a very reasonable thing to want to test. Yes.
00:39:56
Speaker
So the way this works on our platform: because we're trying to target across all these different devices, using something like HTML, hooking into your code or using the structure of HTML to test semantic things in the DOM, just doesn't work well for this type of testing. And so we've taken this very visual approach, where you give the system visual cues for,
00:40:25
Speaker
you know, if this logo appears in this place on the page, and there's this text that says "Home" in this particular part of the screen, and this thing over here is dominantly this particular shade of yellow, then I have a pretty good idea that I'm on the home page.
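The page-recognition idea just described (a page is "recognised" when all of its visual cues match) reduces to a small predicate over OCR detections. This is a toy sketch: the cue format, detection format, and function names are all invented, and the real system runs this over every video frame with computer vision models.

```python
# Toy version of visual-cue matching: each cue names a rectangle on screen
# and the text that should appear inside it. A page is recognised when
# every one of its cues is satisfied by the OCR detections for the frame.
# Cue/detection shapes are hypothetical, for illustration only.

def rect_contains(rect, box):
    """True if detection box (x, y, w, h) lies inside cue rect (x, y, w, h)."""
    rx, ry, rw, rh = rect
    bx, by, bw, bh = box
    return rx <= bx and ry <= by and bx + bw <= rx + rw and by + bh <= ry + rh

def on_page(cues, detections):
    """cues: [{'text', 'rect'}]; detections: [{'text', 'box'}] from OCR."""
    for cue in cues:
        hit = any(d["text"] == cue["text"] and rect_contains(cue["rect"], d["box"])
                  for d in detections)
        if not hit:
            return False
    return True
```

Colour-dominance and logo cues would slot in as further cue types alongside the text check.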
00:40:46
Speaker
So, right, so you'll draw some rectangles. Right. So our system can then take that, and it will process every frame and look in those rectangles that you drew and say: what am I seeing here? Am I seeing this particular piece of text? So it'll be running different types of machine learning to pull this information out, pulling bounding boxes and using OCR to grab text. From that, you can build a workflow. So we have a drag-and-drop tool. And you'll do something like: open this app, wait for the home page to load.
00:41:29
Speaker
Check: am I logged in or am I logged out, based on the presence of the login modal? Yeah. If I'm logged out, go through this workflow. If I'm logged in, go through this other workflow.
00:41:41
Speaker
OK, now I'm logged in. Now I need to go to that channel page that we talked about. So issue this set of commands. Use generative AI to navigate around the screen, because now maybe there's some complex menu that I need to navigate, and the content changes dynamically. So we'll grab a frame, and we'll generate the sequence of ups, downs, lefts, rights, and enters to get to where you need to go. Then we'll use our vision-based system: video starts playing, we're grabbing frames, and our system can understand the content that's playing and draw certain types of information from it. And then we can detect the ad, based on maybe some content or the presence of some text on the screen, or maybe we know the ad, or we can fake it by intercepting the network. And so you can do assertions in our platform based on that.
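In the easy case, where focus sits in a regular grid, "generate the sequence of ups, downs, lefts, rights, and enters" is just differencing coordinates. This sketch covers only that idealised case; as the conversation turns to later, real TV apps break the grid assumption constantly, which is where the learned navigation comes in. The function name is invented.

```python
# Idealised D-pad pathing: if focus moves predictably on a grid, the key
# sequence from one cell to another is just the row/column difference,
# followed by a select. Real apps are far messier than this.

def dpad_path(src, dst):
    """src, dst: (row, col). Returns the remote presses to move focus and select."""
    r0, c0 = src
    r1, c1 = dst
    keys = []
    keys += ["down"] * max(0, r1 - r0) + ["up"] * max(0, r0 - r1)
    keys += ["right"] * max(0, c1 - c0) + ["left"] * max(0, c0 - c1)
    return keys + ["enter"]
```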
00:42:38
Speaker
So this whole visual system is how we compose these tests together. Right. So I do this markup stage manually first, against one known television. And then I can script out the whole thing, which you will then compile down and send to the individual container.
00:43:05
Speaker
So you found me because of Lua on the BEAM, and this is where that enters the picture. Right, okay. So I'm trying to describe this to you without a visual aid, but we have this drag-and-drop workflow builder, right? And as I said: open app, wait for the screen, enter this series of commands, maybe use the keyboard to enter some text to search for a particular piece of content. You can do it all through this visual tool, which is designed for QA, who are maybe not necessarily developers. That workflow that you see visually
00:43:41
Speaker
compiles into Lua code. This Lua code uses a DSL that we've built. And that gets shipped down to our physical device. That physical device implements that DSL, which is using computer vision and machine learning under the hood, but also interacting with our hardware APIs and maybe some of these vendor-specific APIs, to actually run through the test, run through that workflow,
00:44:08
Speaker
make certain assertions, run loops, do whatever it needs to do, and then emit events back up to our web platform, which then get visualized for you. And you have this really interesting visual way of seeing not just the recorded video of your test being run, but the state of the system as it's running through it. So the system will be like: OK, right now I know that we're on the home page, based on what you've told me about the home page. This is the network activity while this is happening. This is the audio level while this is happening. These are the elements on the screen that you've told me about; I recognize the presence of this thing; this thing is not here.
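The "workflow compiles into Lua" step can be pictured as a tiny code generator: each drag-and-drop step becomes a call into the device-side DSL. To be clear, the DSL function names below (open_app, wait_for_page, press, assert_visible) are guesses for illustration; TVLabs' real DSL is not public, and this generator is Python only for the sake of a runnable sketch.

```python
# Hypothetical workflow-to-Lua compiler: each (op, arg) step from the
# visual builder is rendered as one line of Lua calling the device DSL.
# The DSL names are invented; only the overall shape is the point.

def compile_workflow(steps):
    """steps: list of (op, arg) tuples from the drag-and-drop builder."""
    emit = {
        "open_app": lambda arg: f'open_app("{arg}")',
        "wait_for": lambda arg: f'wait_for_page("{arg}", {{timeout = 30}})',
        "press":    lambda arg: f'press("{arg}")',
        "assert":   lambda arg: f'assert_visible("{arg}")',
    }
    return "\n".join(emit[op](arg) for op, arg in steps)
```

The generated script is what would be shipped to the device, where the DSL implementation drives the hardware and emits events back up.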
00:44:52
Speaker
And from this, you can build this really powerful way of navigating and making assertions about content, and then reporting it and being able to see changes between different runs.
00:45:05
Speaker
Okay, so if I then go to test this against a dozen different kinds of television, am I going to be able to step through and say: okay, they all made it to the login page, they all made it to the channel guide page, but one of them didn't make it past the channel guide page and got stuck, and that's the one I need to do more testing on?

TV Labs' Role in App Development Support

00:45:28
Speaker
So when you write a test on our platform, you can then schedule it and say, I want this to run on these five different devices. And because it's visual, and because most of the time the look and feel of these apps is pretty consistent across devices, this becomes pretty powerful for writing end-to-end tests against apps that are implemented in completely different technologies.
00:45:54
Speaker
And so what you're pointing out is that you can find inconsistencies in implementation, or bugs, or maybe slowness or buffering, or just some issue that you weren't expecting on one particular device. So you might run this test when you make a code change in your app: you sideload a build
00:46:14
Speaker
through the TV Labs platform in your CI, it then runs on these devices, and it reports back to you: it worked great on Panasonic, it worked great on Sony, it worked great on Samsung, but on Fire TV you had this issue. And then you can go in, see the recording, see what the system was doing
00:46:31
Speaker
as it was navigating through your test. And then you have all this information, including the device logs and all the telemetry that we're collecting, also network calls and things like that, that you can go and inspect to figure out what went wrong, and maybe go into your logs and correlate some sort of issue.
00:46:50
Speaker
Yeah, because remembering the bad old days of the browser wars, my expectation would be that it fails on something most of the time for some dumb reason. Yeah. Okay, so I've never really been happy with this kind of front-end testing, because it's always fiddly and messy. But if this doesn't sound ideal, it at least sounds like we're approaching the best that can be done. Is that fair to say? I think so. I think the ideal experience is that you have something akin to OpenTelemetry running inside of your actual app, and you have all the traces and everything that you might need to be able to debug the issue. And you might actually have that.
00:47:49
Speaker
And you might have that in production; you'll have your analytics systems that can do measurements. But we're talking about many different stages in the software development lifecycle. And you also have to think of the volume of changes, and the audience, and the fragmentation of where all these apps are going. I actually don't even know the count, but it could be hundreds, probably not thousands, of different target platforms with slightly different versions and things like this. We have the same problem in browsers, but there the technology is more consistent.
00:48:29
Speaker
So solving these problems is very challenging. And the point where we're trying to come in is that we can help you in any part of the software development lifecycle. We're trying to provide tools that give you all this different telemetry, so that when things are flaky, which they will be, we can give you more indication of why they're flaky. Was it because the network was being finicky and we couldn't load the content from this particular CDN POP? Or
00:49:01
Speaker
maybe it's because on this one particular platform, you actually do have a legitimate bug that you need to fix. We're trying to help you isolate those, and to let you run on these different platforms so that you can find these inconsistencies and fix them, without having to manage this massive fleet of hardware and have an operations team to run it. We try to solve all of those problems

Complexity of TV Remote Navigation

00:49:26
Speaker
for you.
00:49:26
Speaker
Yeah, yeah. I can see that; there's your business pitch. Yeah, and it makes sense, right? Because that would be hard to do in-house unless you're one of the huge players, in which case you're probably going to end up trying to solve the same problems that you're solving, or just throw bodies at it.
00:49:43
Speaker
And even if you have the budget to throw a lot of money at it, these are really challenging technology problems. And it's not like there's one big problem; it's many, many different small, medium, and large-sized problems. Let me ask you about one specific one, because it's a devil's-in-the-detail type of thing that comes up in these. I have found, as a user of TV apps,
00:50:09
Speaker
that sometimes you're trying to navigate around a 2D space that wants a mouse, but you've only got up, down, left, and right on the remote control. So getting the cursor from whatever it's currently highlighting to the thing you actually would like to click on is sometimes really counterintuitive. Sometimes you click left and the highlighted button jumps to a place you didn't expect.
00:50:36
Speaker
How on earth does machine learning get around the potential craziness of navigation you get in televisions? Well, we have the luxury of being able to view many, many, many different sessions of interactions with televisions. And I'm not trying to claim that we've solved this problem. I think this is one of our biggest challenges to overcome, which is: can we really build an agent that can truly understand what you're seeing on a TV, understand what's navigable and what's not, what's clickable and what's not, and be able to infer any one of these things? There's a feedback loop there, which is: if I press left, what changes on the screen? Am I able to recognize what is actually currently highlighted on the screen?
00:51:26
Speaker
To some extent, we're able to do that. To another extent, it's really specific to the app and the platform you're on. This might be, as you said... I'm imagining these sidebar navigation things, where things pop into existence that you didn't expect and the focus jumps over. Yeah. Or sometimes you hit left and it just stays exactly in the same place, and it's only when you hit the back button that it jumps you back to the menu. Yeah, crazy navigation in TV apps. Definitely a thing. Yes. So I think this problem is solved over time by being able to train an agent that can
00:52:08
Speaker
be anchored at any one point on the screen, make a change, and observe what changes on the screen. And then, if it has the ability to infer which pieces of content are navigable and clickable, you can build a graph of understanding: this is how I move across the app.
00:52:27
Speaker
And if you can do this with enough time and enough examples, you can actually build a whole map of how to get from point A to point B through enough of these iterations. So I think that's the goal: if we can automate that, then we have a really powerful system that could probably crawl your TV app and find your issues for you.
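The press-and-observe loop just described can be sketched as a graph crawl: from each screen state, try every key, observe the resulting state, and record an edge. This is entirely illustrative; the real agent works on video frames and learned recognisers, not hashable states, and the function names are invented.

```python
# Toy navigation-map builder: press each key from each known screen state,
# record where focus ends up, and keep exploring newly seen states. With
# enough sessions this yields a map of how to move across the app.

def crawl(start, press, keys=("up", "down", "left", "right")):
    """press(state, key) -> resulting state. Returns {state: {key: state}}."""
    graph, frontier, seen = {}, [start], {start}
    while frontier:
        state = frontier.pop()
        graph[state] = {}
        for key in keys:
            nxt = press(state, key)   # drive the device, then observe the screen
            graph[state][key] = nxt
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return graph
```

Given such a graph, getting from point A to point B is an ordinary shortest-path search over key presses.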
00:52:50
Speaker
But we're not quite there yet today. I can see how you've got the training data, if not yet the solution. That's right. Okay, so I think we should then get into scheduling, because this is clearly a problem. How many different kinds of TV do you actually have in the warehouse? What kind of numbers are we talking about? I don't want to reveal too many numbers. Give me a power of 10.
00:53:18
Speaker
We'll be in the hundreds. Okay. The challenge we have is that it's not just about different types of device, because for any given manufacturer, they probably have a low-end television and medium and high-end devices at all different price points. They also have them at all different sizes, and then with different capabilities. Maybe some are built for... oh man, I'm blanking on this. There's a new standard for streaming that is kind of going to disrupt app stores: ATSC 3.0, which is a combination of broadcast and internet protocol coming together. So you might have certain devices that have these capabilities; I think they're called NextGen TVs. You'll probably be hearing about this in the next few years. Right.
00:54:13
Speaker
We need to have all of those. But what's actually really key is that we don't have to have all the different sizes; that tends not to matter so much. What matters is that the low end versus the high end are going to have very different capabilities and very different performance. But the 42-inch versus the 65-inch? Probably going to be about the same.
00:54:36
Speaker
And so we've optimized our fleet to carry 55-inch televisions, because most manufacturers make that size; it's a pretty popular size. Now, I mentioned much earlier that the problem we tend to hear from our clients is not "do you have this latest and greatest device?" It's "do you have this device from seven years ago that 8% of my audience is still on, where we're having this one bug that's breaking all streaming?" You know, maybe a very old boxer is fighting some YouTuber. Yeah, yeah.
00:55:17
Speaker
We have to solve that problem. Yeah, because I imagine most people only change their television once a decade, so you've got to support the old stuff. That's right. So we go back, I think, six or seven model years for most devices. And our inventory tends to be smaller around the older stuff, with more volume around the newer types of devices. But then we have these shelves of set-top boxes that we keep talking about; those are really easy, because they're small, so in a small space you can have a lot of them. But it's the televisions that pose our biggest challenge.
00:56:00
Speaker
OK, so I'm imagining that you've gone through this warehouse of hundreds of set-top boxes and televisions, and you've tagged them with metadata. Like: I'm an LG device, I'm 55 inches wide, I'm from this year, and I'm running this operating system, right? That's right. And then I, as an app developer, say I want to test on one of each of the major brands. Now you have a classic scheduling and queuing problem.
00:56:31
Speaker
Right. You've got to ship that test to the available TVs that match those criteria. Tell me about that. Sure. So we have what we called our demand system. It's a very boring name, but it takes demand for particular devices, and it matches a user based on a priority queue, starting with: do I have a license? So we charge for our product based on concurrent access to televisions. So depending on the number of licenses that you have,
00:57:08
Speaker
we will allow you to connect to that many televisions at once. So if you have three licenses, then you can run a test, and at the same time we could run it on three different devices. But you are also competing with your teammates, who might not like that.
00:57:24
Speaker
Yeah. So each one of our clients sits on a different queue. And so if you queue up 100 different tests, we will try to run all 100 of those tests for you. There is a max time for how long we'll let something sit in the queue. But you can schedule up all this demand, and we will serve it to you based on the number of concurrent licenses. But we will prioritize those requests
00:57:53
Speaker
globally, across all the different organizations who might be using our devices, in a sort of round-robin fashion. What this does is minimize the amount of contention between different organizations who might be competing for the same devices.
00:58:11
Speaker
And we'll prioritize based on time. But if you haven't made a request recently and someone else is putting in a lot of volume, their priority will, over time, start to decay a little and drop below the person who's just, "hey, I'm here to test today, and I'm not sending so much volume through the system." Yeah, yeah. So you are partially optimizing for throughput there.
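The fairness idea in the demand system, per-organization queues served round-robin so one org queueing 100 tests can't starve an org queueing one, can be sketched minimally like this. License caps and the time-decay priority are deliberately left out, and all names are invented; TVLabs' real scheduler is an Elixir system, not this Python.

```python
# Minimal round-robin scheduler over per-organization FIFO queues: take one
# request from each org in rotation until capacity is filled or the queues
# are empty. A sketch of the fairness idea only.

from collections import deque

def schedule(org_queues, capacity):
    """org_queues: {org: deque of requests}. Returns up to `capacity` picks."""
    picks, rotation = [], deque(sorted(org_queues))
    while len(picks) < capacity and rotation:
        org = rotation.popleft()
        if org_queues[org]:
            picks.append((org, org_queues[org].popleft()))
            rotation.append(org)   # go to the back of the rotation
    return picks
```

The decay behaviour described above would layer on top, weighting each org's position in the rotation by its recent volume.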
00:58:38
Speaker
That's right. So we want our system to be utilized; we want to be able to use our capacity to its full extent as much as possible. And in order to do this, we keep, you can kind of think of it as a distributed database: we keep an in-memory database that uses CRDTs to track the current state of each device in our fleet.
00:59:08
Speaker
OK, that's interesting. I wouldn't have thought CRDTs came into this. Tell me more. OK, so our web platform needs to serve many different geographic regions, because we have customers in the US. We also have customers in Europe and the UK. We have customers working in Asia, in places like India and Tokyo, things like that.
00:59:34
Speaker
So our web platform is running edge compute all over the place to give those customers a nice experience, because it's a very interactive platform that we offer. The problem here is that each individual device can only be used by a single user at a time. Yeah. So in that sense, each one of our devices is its own source of truth for: am I available or am I not?
01:00:06
Speaker
um However, sending a request to every single device and being like, hey, are you available um to take take on a new session is you know not very efficient. um And also, those devices I mentioned, they you know connect up to our web platform. They might not be connected to the same node that you're currently interacting with. you know I'm connecting to our web platform. I'm using Phoenix Live View to interact with but the web platform. um i in I'm sitting in London, whereas we have a server sitting in ah New Jersey. ah so Our device actually might be connected up to that device in New Jersey.
01:00:45
Speaker
Now, some other user in San Francisco might have just used that device, and now it becomes available. How do you know about it? This is where the CRDT comes into play. So the CRDT is keeping track of the current state of each television. Is it available? Is it online? Is it in use? Is it healthy? Is it available for connection?
01:01:06
Speaker
And our demand system queries that as the first check for: can I serve this user's request right now, or do I need to keep them waiting? And we'll let them know that there's nothing available and we expect something to come online soon, or that their teammate is currently using it. All this information is tracked by the CRDT.
01:01:30
Speaker
But when it comes time to say, okay, we now think that this device is available, we've selected you to pair with it, we'll then initiate a session. And the way this works is that even if you're sitting in London but the device is in New Jersey, through our PubSub system it will send the request for the session down to that device. And now, because that device is the source of truth, it will either accept it, by sending a message back and saying, let's initiate a WebRTC connection; or it will reject it and say, hey, I'm actually unhealthy, or,
01:02:06
Speaker
the CRDT reported me as available, and I'm actually unavailable due to some bug. But the device ends up being the source of truth. The CRDT is the thing that keeps it fast; it's this in-memory database for knowing who's available and online.
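The availability tracker described here behaves like a last-writer-wins map, one of the simplest CRDTs: each replica keeps a timestamped status per device, and merging two replicas keeps the newer entry for each device, so every node converges on the same view without coordination. This is a generic LWW sketch, not TVLabs' actual data structure, and as the conversation notes, the final "am I really free?" check still goes to the device itself.

```python
# Last-writer-wins map merge: each replica is {device_id: (timestamp, status)}.
# Merging keeps the newest entry per device, so replicas converge regardless
# of the order in which they exchange state.

def merge(replica_a, replica_b):
    """Merge two {device_id: (timestamp, status)} maps; newest entry wins."""
    merged = dict(replica_a)
    for device, entry in replica_b.items():
        if device not in merged or entry[0] > merged[device][0]:
            merged[device] = entry
    return merged
```

The key CRDT property is that merge is commutative and idempotent, which is what lets geographically distributed nodes gossip state freely.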
01:02:23
Speaker
Okay, so it's kind of like you're connecting to a central server to find out which peer-to-peer connections you're allowed to make. That's right. Sort of, yeah. And the demand server is connecting to the local database, which replicates the actual source of truth, which is the individual container. That's right. I think I have that in my head.
01:02:51
Speaker
Okay. And when we say server, we're talking about... we just have our Elixir apps running in the cloud. There aren't all these different microservices, each responsible for one thing. It's just one web platform that has all these different responsibilities, managed by GenServers that use things like hash rings to distribute the load across all the different servers globally.
01:03:16
Speaker
I was going to say, does this actually end up just being a bunch of Erlang actors? And is that a reason for choosing Erlang? Well, there are many different reasons for choosing Erlang.
01:03:29
Speaker
So, okay, there are a lot of problems that we're trying to solve that I think Erlang and Elixir and the BEAM virtual machine are very good at. The BEAM is really good at fault tolerance: let it crash, and we'll restart it for you if it goes down. It's good at low-latency and decently high-throughput systems. And it's really good for building systems that are relatively simple, thanks to immutable data and functional programming.
01:04:06
Speaker
And then there's the whole distribution aspect, which is being able to connect nodes that are globally distributed and have them communicate with each other, send messages back and forth, and subscribe to different topics of interest, without really having to worry about the orchestration of it all.
01:04:24
Speaker
Yeah, and that's really good bread-and-butter programming there, right? It's not a weird thing. Yeah, that's right. And what's so interesting, I think, about Elixir for this particular solution is that in any other programming language, you'd probably have to have a constellation of other services just for doing the communication. If we take WebSockets as an example: all of our different devices, TVs, and set-top boxes communicate up to our web platform by making a WebSocket connection. In most other systems, you'd have to have a separate service just for holding on to those WebSocket connections, and then use something like RabbitMQ to do eventing to all your other systems. We can just do that in Elixir and Erlang, and that's really easy. But the thing that enables, which is so interesting, is that you can now do distributed PubSub, where I can click a button in the UI that emits an event, and that is PubSub'd directly down to the device that's listening on a particular topic, and it acts as this giant distributed system
01:05:32
Speaker
that's heterogeneous, that's actually running very different code, but that can communicate with all the other nodes in the system with very little effort. There's a term that's been going around in the BEAM community that I really like, and that I think gets to the core of why I like it so much, which is operational simplicity.
01:05:52
Speaker
Doing this takes very little effort. It takes very little effort to build these kinds of tools; networking it together is just DNS. Doing the WebSocket connection uses standard tools in the Phoenix framework that are very available and easy to use, and that would require quite a lot of specialized knowledge and operational complexity to deliver on most other platforms.
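The "click a button, the device reacts" flow reduces to topic-based publish/subscribe. This toy broker shows the shape of the pattern in a few lines; in the real system, Phoenix.PubSub provides this across globally distributed BEAM nodes, with each device subscribed to its own topic.

```python
# Toy in-process topic broker, illustrating the PubSub pattern: subscribers
# register callbacks on a topic; broadcast delivers a message to every
# subscriber of that topic. The distributed version is what the BEAM adds.

class PubSub:
    def __init__(self):
        self.subscribers = {}          # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def broadcast(self, topic, message):
        for callback in self.subscribers.get(topic, []):
            callback(message)
```

A per-device topic name like "device:tv-42" (hypothetical here) is the usual convention for routing a session request to exactly one listener.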
01:06:22
Speaker
Yeah, yeah. Now, that's not to say that we don't use a lot of other technologies. We've got Rust, we've got Go, we've got C++, we have Lua, we have JavaScript. But Elixir sits at the core and solves these fundamental data distribution and communication problems for us. Yeah, I can see that; if I were just taking this on paper, by the time you've got to obviously complex failure modes,
01:06:52
Speaker
interesting networking protocols, WebSockets, WebRTC, lots of scheduling stuff and queuing, I'd be surprised if Elixir, or at least Erlang in one of its flavors, wasn't on the shortlist of technologies you'd use.

Why Choose Elixir for TV Testing?

01:07:08
Speaker
Can I ask why Elixir rather than pure Erlang or maybe Gleam?
01:07:14
Speaker
Well, the reason I left Bloomberg back in 2016 goes back to a manager of mine. I was kind of interested in Elm at the time. I had done a little stint of Haskell in my programming languages class in college, and when I joined Bloomberg, I was doing quite a bit of C++ and Fortran, but that's a whole other conversation.
01:07:39
Speaker
I then dipped my toes into JavaScript, and I mentioned that whole single-page app framework. But I was kind of itching for something else, and I got into this functional programming kick with Elm. I'm like, this is so cool. My manager was like, oh, you like Elm? You'd probably really like this thing that this guy in the Ruby community has been talking about, called Elixir.
01:08:03
Speaker
And so I was like, oh, let me try that. What I found with Elixir was this idea of pattern matching, which was something I'd never seen in any other programming language, and that was the thing that got me hooked. So I left Bloomberg to go do a media startup called The Outline back in 2016, and I've basically never looked back, because once I found Elixir,
01:08:28
Speaker
I just found its way of solving problems so approachable. It made things that I thought were really hard in the other languages I'd used before elegant and simple, and I was able to approach really interesting problems. You know, I was at a sports betting company for four years, and we were doing a lot of real-time data processing, and we built almost the entire platform in Elixir.
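For anyone who hasn't seen the pattern matching Dave mentions, here's a minimal sketch; the module name and event shapes are invented for illustration, not from the episode.

```elixir
defmodule DeviceEvent do
  # Each function clause matches a different shape of incoming message,
  # destructuring tuples and maps directly in the function head.
  def handle({:ok, %{"status" => "playing"}}), do: :all_good
  def handle({:ok, %{"status" => other}}), do: {:unexpected_status, other}
  def handle({:error, reason}), do: {:retry, reason}
end

DeviceEvent.handle({:error, :timeout})
# => {:retry, :timeout}
```

Dispatching on the shape of the data, rather than with nested conditionals, is the quality that tends to hook people coming from other languages.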
01:08:56
Speaker
It's just something that, I want to say, is part of my identity. I've built a lot of knowledge and expertise in it, and I find that I'm able to do a lot with relatively few resources with the BEAM.
01:09:16
Speaker
And Elixir has just been my tool of choice. Now, I know you've had Louis Pilfold on the podcast. He and I sat down for dinner at Code BEAM Europe last month in Berlin. He's been working on Gleam for the last four or five years now, and I've been sponsoring him since close to the very beginning, because I think it's just so great. The BEAM is this amazing piece of technology that's, in my opinion, very underutilized, and it's great to have more players and more ways to onboard people into the ecosystem. So if something better than Elixir comes out, I would certainly try it. You know, I've got my "rewrite it in Rust" poster right here.
01:10:01
Speaker
I love technology and I love trying new things, but Elixir is just this hammer that I find useful in so many different contexts. It's so capable, and it allows me, with just very few people, to build a product like TV Labs with relative ease and a kind of straightforward path. That's why I'm really excited about it. I can see both the natural fit with the domain, given the amount of networking and queuing and reliability,
01:10:36
Speaker
and even the people I know who have the broadest tastes and palates for programming languages usually have a favorite one too, which is just the place you go to get stuff done, right? It's home. Yeah, yeah. I think mine at the moment is PureScript, but I'm always tempted away slightly. I love PureScript. It's the language I choose when I don't want to think about which language to use.
01:11:03
Speaker
Wow, you know, I haven't heard of PureScript being used in quite some time. The one that does keep popping up is, is it ReasonML? Oh, yeah, yeah. Is that the one that's very front-end focused and has a really tight integration with React? Oh, now, we did a whole episode on this and I wasn't expecting a test, but yeah, if you want to take OCaml to the browser, that's where you go.
01:11:32
Speaker
I think it's ReScript now. I don't know, they might be related. My OCaml is... I need to dabble in OCaml. There are some cool things happening in that community over there. Me too, me too. Yeah, I need to spend more time with OCaml. There are only so many hours in the day. Which leads me to ask, perhaps to wrap up: hours in the day, where are you spending yours in the coming months? What's left to solve in this space?
01:12:03
Speaker
So, we've been a business now for about a year and a half, and we've been working with customers and clients, but we're very much in our go-to-market phase. A lot of this is taking these fundamental products and features and making them very usable, really dialing in the UX. I think we've spent most of our time making the system reliable. You know, the five different ways of restarting our system kind of goes into it, where we've had to dogfood our platform, use it over and over and over again, and run into all these stumbling blocks. It's only recently that we feel our platform is so reliable that it's really useful for other people too. So what we're spending most of our time on, in JavaScript land and Svelte, is polishing these tools and building features for introspecting state and really being able to visualize what's happening in our system, so that it's really easy to go and diagnose a problem.
01:13:06
Speaker
And this is kind of the very top layer of our system, where all of that low-level stuff underneath it is abstracted away from you. Our product just seems, you know, so boring and obvious, that of course you'd be able to work with the television, but there are all these little bits and pieces and hard problems that we've solved along the way that make the high-level thing really work. So my time and my focus is spent leading the team to
01:13:37
Speaker
solve these UX problems and make the product usable in a way that lets you analyze your problems and find correlations, and to build, you know, the observability tooling part of our product, so that an error reported in CI is easy to dive into, find the root cause of, and jump back out of our product. Yeah, yeah. As hard as stitching together four live video streams in a reliable, scheduled, queued, distributed network of televisions sounds, it's probably still not the hard part. The hard part is making things usable for users. That's right. That's always the hard part. It's always JavaScript that's the hardest part.
01:14:22
Speaker
I'm not saying a word; you said it, you're now on record. Oh no. On that note, maybe I should leave you to go back to, presumably, Figma, and keep working on the UX. That's right. Dave, thanks very much. That is fascinating. One day I'd like to go and visit the warehouse where all the televisions are kept.
01:14:42
Speaker
We'd love to have you. Awesome. Thanks. Thank you, Dave. As always, you'll find links to the things we discussed in the show notes. And if you've enjoyed this episode, please take a moment to like it, share it with a friend or a social network, and make sure you're subscribed because we'll be back next week with another... No, we won't actually.
01:15:01
Speaker
We won't be back next week with another episode, because I'll be taking a break for Christmas. So happy Christmas if you celebrate it, and, you know, just general happiness from me if you don't. Thank you very much for your support during this year, especially if you're one of the people that supports us on Patreon or YouTube memberships and helps keep Developer Voices running. Thank you.
01:15:24
Speaker
Before we go, I'm going to leave you with a question. Dave mentioned the Apples and Netflixes of this world, and it got me wondering: you hear people in this industry debating what the plural of database index is. So I ask you, is the plural of Netflix "Netflixes" or "Netflices"?
01:15:44
Speaker
I shall be spending far too much time thinking about that over the holidays. So, on that note, I've been your host, Kris Jenkins. This has been Developer Voices with Dave Lucia. Thanks for listening.