00:00:01
Speaker
You're listening to the Archaeology Podcast Network.
Introduction and Podcast Overview
00:00:08
Speaker
Hello and welcome to the Archaeotech Podcast, Episode 174. I'm your host, Chris Webster, with my co-host, Paul Zimmerman. Today, we give a few updates and then we talk about using deep learning for feature identification. Let's get to it.
00:00:23
Speaker
All right. Welcome to the show everyone. Paul,
Pre-trip Preparations and Equipment Anxiety
00:00:25
Speaker
how are you doing? I am doing okay. The clock is ticking down before I head back to Iraq and I'm getting a little nervous about not just everything I have to get done before I go, but also what the work's going to be there. But I keep on putting myself in these situations where I'm just a little nervous. I think I like it.
00:00:42
Speaker
like setting up a challenge and then you're rising to meet it. But we'll see. This feels like I'm pushing myself a little hard this time. You've got the normal travel jitters, especially going halfway across the planet to another country. And then there's the whole thing about, oh man, am I bringing the right supplies for this field work? What are we going to do? What challenges are we going to encounter there? I mean, there's a lot of room for anxiety.
00:01:10
Speaker
Yeah, I don't get travel anxiety, fortunately. So that's not part of it with me.
Thermal Imaging Tech and Deployment Plans
00:01:16
Speaker
We have the equipment that we need, with the exception of one thing. So I got for the project a very sophisticated FLIR Boson thermal imaging camera. And this thing is cute as a button.
00:01:31
Speaker
barely bigger in any dimension than a quarter. A little itty bitty thing that we're going to put on our drone. But we also ordered from a different supplier, a custom built gimbal and mount and batteries and transmitters and all that sort of stuff to go alongside it so that we can actually use that camera when it's up in the air. And none of that stuff has arrived yet.
00:01:54
Speaker
So I have the camera and no way to actually use it. So I'm hoping that that gets here within the next week, because if you're listening to this when it drops, I'll be leaving the very next day. So, you know, we're really running up against the wire here.
00:02:10
Speaker
Actually, have I told you what we're planning to do with these thermal cameras? You might be interested. I don't think we've talked about the thermal cameras yet.
Detecting Ancient Architecture with Imaging
00:02:17
Speaker
Yeah, so the site that we're at, Lagash, it's a very broad, flat tell site with not a lot of time depth to it, in that, if you look at the sherds that are on the site, most of them are of the Early Dynastic period in Iraq.
00:02:32
Speaker
And when you look at the site from the air, you can actually see a lot of architecture that's buried just below the surface, so ancient architecture. And you can get a lot of the sense of the buildings there, the city wall, the neighborhoods, the plan more or less, the layout of how the city is put together.
00:02:55
Speaker
We want to use this thermal imaging alongside other kinds of techniques. So we have some visual imagery, photographic, and we're going to get some more in the spring. I was there last in the fall, and it was very dry, so the surface of the site was kind of crusty. You couldn't see much of this buried architecture. When we're there in the spring, it's going to be wetter, so we're hoping to have better visualization of
00:03:19
Speaker
that buried architecture. We're also going to throw other things at it. The first one that we're going to do is this thermal imaging, to see if that highlights them in any way. We also have plans down the road to do magnetometry and resistivity, to also get at where the architecture is on the site. It's one of a suite of things. We've got the camera, again, and don't have the gimbal and don't have the mounting hardware and don't have the transmitter and don't have the battery.
00:03:48
Speaker
So I think I'm a little nervous about that in terms of equipment. And then the other thing is just that it's a big project and I'm going to be there for five months. And there's been family stuff that's gone on in the meantime that has kept me very distracted. And yeah, there's going to be a lot to do. Did you say five months? I probably did say five months. I mean five weeks.
00:04:10
Speaker
Oh, I was like, what is that? That's how distracted I am. My wife daily is telling me about things that I totally hate or forget to do. Like we would bounce around between the city and Brewster and I forget to bring the dog's food with us. And so then I have to
00:04:29
Speaker
We got back up to Brewster the other day and she looked and there was a whole bunch of dirty dishes in the sink. I was like, oh yeah. Along with forgetting to bring the dog's food, I forgot to do the dishes before I left. So I have lists and lists and lists of things that I have to do before I go. And that's probably where that anxiety is coming from, just looking at that list.
00:04:52
Speaker
I'm hoping I get through everything and don't leave piles of dirty clothes and remember to bring underwear and all that important stuff.
Travel Restrictions in Iraq
00:05:00
Speaker
I mean, you can buy underwear in Iraq. I'm sure they wear underwear there. Yeah, well, we're not allowed to go anywhere without escorts. Oh, yeah. Yeah, so we're going to be on the site and in the dig house and that's pretty much it.
00:05:15
Speaker
So yeah, I do need to have everything. I got to remember medicines and, you know, all the little piddly stuff that you always forget and always have to bring. And yeah, anyhow, blah, blah, blah. You are very experienced at traveling. So this is my segue. Where are you?
RV Lifestyle and Connectivity Challenges
00:05:34
Speaker
So literally today we moved the RV to Palm Springs, California. We're only here for a handful of days. We're in an RV park. We have what's called a Thousand Trails membership, and there's various levels of that. So at the level we have, we get free stays here. The caveat is, if you stay here,
00:05:53
Speaker
regardless of the number of days (I think that's true), you have to be out for seven days before you can come back in. And your maximum stay limit is 14 days. So there's some caveats there, and you don't always want to be in a park here. Like, this one is pretty well cramped. There's pretty tight quarters as far as the RVs go. The weather's beautiful. We're surrounded by, there's probably 400 palm trees in this park. The really tall ones, like the really cool tall ones that are like 150 feet tall, those ones are super neat.
00:06:22
Speaker
But otherwise, yeah. But see, here's the cool thing, and this leads into what I wanted to talk about first. We spent the last five days, I think we got there Sunday, on a dispersed BLM camping area basically south of Blythe, California, but on the Arizona side of the Colorado River. And we were camped right down in a depression off the road, so it was kind of protected, right next to a little, what I want to say is probably an overflow flood channel for the Colorado.
00:06:49
Speaker
It wasn't on the Colorado, but it was on what kind of looked to me like almost a manmade side channel for the Colorado. So that was super cool because we hardly saw anybody. We didn't leave the whole time because we had filled up on gas and groceries and everything we would need, water, all that stuff.
00:07:05
Speaker
And we just kind of hung out and worked the week and had a great time. Didn't have AT&T cell service. Verizon was garbage. Sprint was kind of okay, but Starlink was perfect. It was like we were home because that's where we are when we go to the RV. We're always home. And that's one thing I really love about this lifestyle.
00:07:29
Speaker
But one thing I've noticed about the Starlink, and I mentioned this before about getting into different service areas, and this is an update I wanted to give from two weeks ago when we talked about this. When we first got it, I was looking around at different service areas, because you have to move your dish to wherever you're going. And the minute you move it, if the system accepts that address, you lose service where you're at. So it's not like you can just play around. You'll lose service while you're trying to play around. So if you don't have a backup service,
00:07:59
Speaker
to try to still access the internet and then move your service back, well, you're done.
00:08:05
Speaker
You can't do it. So you've painted yourself into a corner. Basically, yes. Yeah. Unless you have a different, like a second, Starlink account. But like I mentioned before, I mean, that costs you money whether you have the dish or not. So it's not really cost effective to do that. So we kind of play Starlink roulette. My wife came up with that term today, where we're just shooting around, trying to figure out where we can get a cell. And when I was looking this morning here in Palm Springs,
00:08:35
Speaker
I looked right at the campground, of course, you always start right in the center and it didn't say there was no service, which was good, but it did say it was at capacity, which means there's too many people here that have Starlink and Starlink doesn't have enough capacity to cover everybody. So I started looking around and I went north because there's mountain ranges just to the north of us, more rural areas. I went south, nothing. I went back west. East is just more city, so I didn't even try it.
00:09:00
Speaker
and wasn't having any luck. So I was like, okay, well, I guess, you know, for a few days, because we're leaving here on Saturday and it's Thursday now, we're not going to have Starlink. Well, we get here, and the camp Wi-Fi, we can't even connect to it. The AT&T 5G is absolute garbage. I had a call right when I got here and it didn't work at all, basically.
00:09:19
Speaker
Our Sprint isn't working very well because there's too many people here on it, too many people in this area. And Verizon was okay, but we don't have enough data on Verizon to really last more than probably a day and a half or so.
00:09:34
Speaker
It was like, what are we going to do? And my wife was like, let me see that Starlink account. So she just started bouncing around on Google Maps and found something. I think the driving directions were like eight miles away to the point that she found. So as the crow flies, it's probably six or seven, and it bounced the address right in. I was like, well, okay, let's try it.
00:09:53
Speaker
Sure enough, that's what we're talking on right now with this podcast, and it's working great.
Starlink Internet Experiences
00:09:59
Speaker
It's weird. I think it's because we're outside the cell, but we're getting anywhere from 30 to 90 megabits per second down and anywhere from 20 to 50 up.
00:10:09
Speaker
That's pretty good. When we were in Phoenix at the end of last week, we were actually in Apache Junction, which is outside Phoenix. I did the same thing, except this time I was 10 miles away from the address where we were at, and we still had Starlink working perfectly fine for us in a non-service area. The place where we were didn't have capacity. It said no service here, and yet I found a cell outside of where we were. Now, you have no way to know where the edges of those cells are, but when I was bouncing out, just ratcheting out the
00:10:39
Speaker
the pin there trying to look at the Google Plus codes and bring them in, it was about 10 miles. So we were that far outside of an active cell and still were able to use Starlink. I'm very encouraged by that. So that's pretty awesome. Good, good. That's working for you.
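Incidentally, the "as the crow flies" figure Chris estimates here is what the haversine formula computes from two latitude/longitude points. A minimal Python sketch, using made-up coordinates roughly ten miles apart (not the actual campsite or Starlink cell):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle ("as the crow flies") distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Two hypothetical points at the same latitude, 0.17 degrees of longitude apart.
print(round(haversine_miles(33.40, -111.55, 33.40, -111.38), 1))  # about 9.8 miles
```

The gap between driving directions (eight miles) and the straight-line guess (six or seven) is exactly the road-distance versus great-circle difference this computes.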
00:10:56
Speaker
Yeah. I mean, it really is allowing us to work. I have a new client, and they're in Belgium, and they have people on the call that are in the Philippines and Australia. And the only time that worked for everybody, except for me... we have meetings four times a week, for two hours each meeting, and it's at 10 PM for me, 10 PM to midnight.
00:11:22
Speaker
So because of that, I was like, man, I was already starting to look around before we got the Starlink to work. I was like, there's no way it's going to work here. I have to screen share. We have video turned on. There's no way it's going to work. And I was already like, OK, Starbucks leaves their Wi-Fi on overnight. McDonald's leaves their Wi-Fi on overnight, even though they're 24 hours most of the time. So it's a great source of Wi-Fi because nobody's using it. And it's pretty high speed. And I'm like, where am I going to sit in the car and do this meeting at 10 o'clock at night? But now that we have the Starlink working, it's much better.
00:11:52
Speaker
That leads me to another update that we got from a listener. I had already looked, because we weren't getting Starlink service here. We're going to be in Visalia, California next week. If you're listening to this in real time and you're at the Society for California Archaeology meetings, because this will be day one on the day that this podcast releases, then stop on by the Wildnote booth. Rachel and I are running the Wildnote booth for the whole conference.
00:12:15
Speaker
We'd love to talk to you, or even just say hi. Even if you're not interested in Wildnote, just come on by and say hi, if you're a podcast fan or something like that. We'd love to talk to you. But we're staying at a campground near Visalia, and I'd already moved the service there, but I was getting a nice little red bar across the top that says we're experiencing degraded service in this area and are currently investigating the issue, or something like that.
00:12:40
Speaker
And I had to wonder, is it related to what I talked about last time with the 40 Starlink satellites that were deorbited because of the solar flare? I don't know. I don't know if that's why they have degraded service there or not, because I know they were all kind of in the same area because solar flares are relatively targeted.
00:12:56
Speaker
But I wasn't absolutely sure. Which leads me again to one of our frequent and appreciated commenters, James. He always sends me emails about the episodes, and he's just got such great, insightful things to say that I always look forward to his emails. I don't always get a chance to respond in a timely fashion, James, but I do read them and I very much appreciate them.
00:13:19
Speaker
But he's got expertise that tells him what the answers are. And when I mentioned that the solar flare hit the satellites, and I think we alluded to the fact that it was probably radiation or something that burned them up and then they deorbited. I'm not sure if I actually said that, but I also didn't really know what actually did it.
00:13:35
Speaker
But what James said is that the solar flare comes in. These are low Earth orbit satellites, so they're pretty close to the atmosphere. And what he said is the atmosphere actually gets superheated by a solar flare, expands like heated things do. Anytime you heat something up, it expands and the atmosphere expanded. And basically,
00:13:53
Speaker
I think I'm saying this right, James. It encompassed the satellites, or got close enough to them that it added dramatically to their drag and dragged them basically back into the atmosphere. So the atmosphere basically reached out, grabbed the satellites, and pulled them back in.
00:14:10
Speaker
And I think I got that right. It's a nice visual, huh? I know, right? Yeah. So I didn't even know that was possible, but when you think about it, it has to be possible, right? So it's really neat the way physics works like that.
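James's mechanism can be put in rough numbers. Drag force is F = ½ρv²C_dA, which is linear in air density ρ, so if solar heating expands the atmosphere enough to double the density at the satellites' altitude, the drag doubles. A back-of-the-envelope Python sketch; every figure here is an illustrative assumption, not an actual Starlink specification:

```python
def drag_force(density, velocity, drag_coeff=2.2, area=10.0):
    """Aerodynamic drag F = 0.5 * rho * v^2 * Cd * A, in newtons."""
    return 0.5 * density * velocity**2 * drag_coeff * area

V_ORBIT = 7800.0    # m/s, a typical low-Earth-orbit speed
RHO_QUIET = 3e-10   # kg/m^3, assumed thin air at deployment altitude, quiet sun
RHO_STORM = 6e-10   # kg/m^3, assumed density after the heated atmosphere expands upward

quiet = drag_force(RHO_QUIET, V_ORBIT)
storm = drag_force(RHO_STORM, V_ORBIT)
print(f"{quiet:.4f} N vs {storm:.4f} N, a {storm / quiet:.0f}x increase in drag")
```

The forces are tiny, but applied continuously to a lightweight satellite at a low deployment altitude, the extra drag is enough to bring it down before it can climb to its operational orbit, which is the "atmosphere reaching out and grabbing them" picture.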
00:14:26
Speaker
I think I just wanted to give you that update real quick. And I wanted to mention something else that, you know, we don't really have time to talk about on this episode, but we'll finish out this segment talking about it. Back in June, July, something like that, I saw, and I think it was on Indiegogo, I get that mixed up with Kickstarter, but I saw this thing for augmented reality glasses. Now, I've seen various augmented reality glasses come down the line before, but these ones just looked cool.
00:14:53
Speaker
Even when you're not using the camera features, they just look like futuristic sunglasses. And I was like, well, that's super neat. So I put down the money for them because I think they were going to end up being $499, $500, something like that. But it was like $250 or something on the early bird Indiegogo. So I called it a business purchase for research purposes and said, let's do this.
00:15:16
Speaker
And I finally got them about a week and a half ago, because they were super delayed. It's a Chinese company. It's not just an American company or something that's using cheap labor in China; it's legit a Chinese company. And they were having some issues with supply chain, like everybody else is, and then issues with shipping and getting everything out, because they were supposed to deliver in December.
00:15:37
Speaker
But it didn't actually deliver until mid-February. So now that I've got them, I can report on it a little bit. I didn't get, because I was actually too early for this, I didn't get the little box that they have, where you plug the glasses into this little box, it creates a Wi-Fi network, and then you can connect your phone to that and then see your phone on the screen.
00:15:59
Speaker
Phone usage is primarily how they promote these augmented reality glasses. Oh, and I'll have a link to these in the show notes. It's called the Rokid Air, R-O-K-I-D, Air. That's the name of the glasses. I'll put a link to these in the notes so you can go check it out. But the interesting thing is, first off, they don't have a battery internal to them. They don't work at all unless they're plugged in to a USB-C cord. And that USB-C cord has to be able to handle video. I know it's a little weird. It's not like you're just
00:16:25
Speaker
using these out on the street, although they do show people doing that. What they're really intended for is Android users, and there's a list of about 15 Android phones that they've actually been tested with. I'm sure they would work on some others, but they've been tested with 15 different Android phones, and you download the Rokid Air app,
00:16:42
Speaker
And then that app, first off, they have a store that probably doesn't have anything in it yet, but it's going to have augmented reality apps inside of it. There are some augmented reality functions, which means you have basically an overlay on what you're seeing in the world. But to be honest, mostly what it looks like is just a screen in front of your face.
00:17:00
Speaker
It's like if you were to take the opacity and drop it down to about 80%; you can kind of see through it, depending on what you're looking at. But it's basically just a screen in front of you that is mirroring the screen that's in your hands. And I think there's going to be more functionality, especially if I can get the Rokid Air app to work, because we do have an Android phone here for testing Wildnote things.
00:17:19
Speaker
And I can't get it to work very well on there. It's got some issues, so I've got to figure that out. Otherwise, I have had great success using this on my computer, because the lenses are actually kind of small, and I'm able to look under them really easily when they're sitting on my face. I can just glance down, and I can see pretty much whatever I want. So I've actually used these as a second screen for my laptop, which is less cumbersome than the Oculus Quest, because that's all-encompassing. Like, I'm totally immersed. Yeah, that's always what I ask about.
00:17:50
Speaker
Exactly, right? But with these, I can just use them as a second screen, plug it straight into the USB-C on my laptop, and then go to my display preferences. I didn't think it was going to work at first because it was auto defaulting to mirroring. And I was like, okay, so it's the same screen. But then I went in and just unchecked mirroring and I was able to
00:18:07
Speaker
basically take the Rokid display and put it over the top of the laptop display, and then I could see just fine. So I've been doing that and it's been phenomenal. I really like it. The clarity is good and it's just been really fun to use. That pretty much kills the segment, doesn't it? Yeah.
00:18:22
Speaker
Yeah. All right. Well, with that, maybe we can comment on that on the other side, but otherwise we are going to head straight into talking about something I feel like I've been mentioning for, I don't know, five, six, seven years now, which is identifying features and other things using drone imagery.
Professional Development for Archaeologists
00:18:41
Speaker
We'll be back in a minute.
00:18:43
Speaker
Looking to expand your knowledge of x-rays and imaging in the archaeology field? Then check out an introduction to paleoradiography, a short online course offering professional training for archaeologists and affiliated disciplines. Created by archaeologist, radiographer, and lecturer James Elliott, the content of this course is based upon his research and teaching experience in higher education.
00:19:02
Speaker
It is approved by the Register of Professional Archaeologists and the Chartered Institute for Archaeologists as four hours of training. So don't miss out on this exciting opportunity for professional and personal development. For more information on pricing and course structure, visit paleoimaging.com. That's P-A-L-E-O, imaging.com, and check out the link in the show notes.
00:19:23
Speaker
Hi, welcome back to the Archaeotech Podcast, episode 174. Chris, just before we went to break, you were talking about these Rokid AR glasses, and you compared them to the Oculus. And I was going to ask you the question. You kind of touched on it, but if you could go into more detail, I'd be curious. You got that Oculus specifically because you didn't have a good place to put a good-sized monitor in your RV. You were talking about putting it up on the dashboard, and it would overheat. So the Oculus allowed you to have multiple virtual screens.
00:19:52
Speaker
The last we talked about it, so I didn't get to play with it when I was in Nevada. The last we talked about it, it was working really well for you as an external monitor. I don't know if you're still using it like that, but if you could just give me a little bit of a sense of how these new glasses compare versus the old glasses in that use case.
00:20:15
Speaker
Well, the first thing is, the new glasses I can plug into anything with a USB-C port. Technically, you can plug them into, like, you know, a PlayStation or an Xbox. I could plug them straight into a TV if you wanted to. I don't know why you would, because the TV is sitting right in front of you. But you can plug it into a phone, tablet, and I've done both of those things and it works really well that way. The Oculus Quest, you simply can't do that with. It's not something that you can use as a monitor in that sense. Now, there are ways to do that if you have a PC and some remote desktop stuff, but
00:20:45
Speaker
It's convoluted and doesn't make any sense for most users. The other thing with the Oculus, now the digital office that I use is called Immersed VR, and there's a number of them out there, but they're all basically the same. The Oculus is all-encompassing, so if you're not super good at touch typing without looking at the keyboard, even though you can get an on-screen virtual keyboard and there's ways to do that with
00:21:07
Speaker
certain LG keyboards will go into VR, so to speak. And you can kind of draw the keyboard out in Immersed, but it's not super perfect and it's hard to get used to, to be honest. But the benefit to the Oculus is I can spawn five displays, one of which is my primary display and then four additional displays. And I have full control over where those are, whether they're portrait or landscape, and how big they are, and all that stuff.
00:21:32
Speaker
With these Rokid Air glasses, I basically just have the one display in a 120-degree field of vision, and that's it, right? So if I'm able to use it as a second display, then that is a true second display with pretty decent clarity, or I can just use it to mirror whatever I'm looking at on whatever device I'm on. Now, the one cool thing is, with my iPad,
00:21:53
Speaker
I hooked it into that, and it was just mirroring the iPad. But when you run the brightness all the way down on the iPad, you can still see the screen. It's super dim, but you can still see the screen. When I run the brightness all the way down on my 2021 MacBook Pro, the screen goes black. It literally turns off the screen. And if I'm mirroring that display, well, I can now use this with my AR glasses in a coffee shop on an airplane, something like that, without having anybody peer at what's on my screen.
00:22:20
Speaker
Because we've all been sitting on an airplane and looked up between the seats, going, what is that guy doing with those spreadsheets? That formula is totally wrong. But they're just getting ready for a high-pressure meeting and I'm just looking at their stuff.
00:22:37
Speaker
The privacy aspect is greatly enhanced with this. I like that. And they're lightweight. They don't have a battery that'll die. It does suck battery life from your device. But I feel like the field applications to this could be pretty good depending on the data that's coming into them. If we can get some AR out-type apps and I can just pull these glasses on while I'm standing in the field and visualize all the data I'm looking at in an augmented reality sort of way, that's kind of where I'm hoping technology like this goes.
00:23:06
Speaker
So, is it over one eye or both eyes, the virtual screen? It's stereo, so both eyes, and it brings those images together. And one of the things when we first started talking about augmented reality glasses was the Terminator overlay, right? Get information about what you're looking at. Would that be a possibility with these?
Augmented Reality in Archaeology
00:23:28
Speaker
Hook it up to your phone and have, while you're doing your survey or whatever, have an on-screen readout.
00:23:35
Speaker
I don't see why not. If the application was designed in a way that really was like an overlay where the middle of the screen was left blank, you can see through the image. If I put up a white window or something like that with a white background, it's really hard to see through that. So you'd have to design something that really does, if it truly has a transparent background,
00:23:57
Speaker
and no desktop image or something like that, and then you just have the overlay information around the edges, absolutely it would work. There's no reason why it wouldn't. The thing is, it's not interacting right now with the actual environment. You're seeing stuff that's coming from another device, but it's not, like, identifying mountains in the distance, or able to look at something and get information on it, something like that. It's only passive. It's just displaying what's coming from its own device.
00:24:24
Speaker
Well, I could see that being useful for biometric information, temperature, which you get off your phone, altitude, GPS coordinates, a number of things like that. It would have to be designed in a way that it doesn't interfere with what you're actually trying to do, but just kind of augments it. No pun there, but off to the side.
00:24:48
Speaker
As though your reality were enhanced or augmented. It's crazy, right? It's really weird. How about that? You know what would make these glasses really good though, is if we had a lot more data, imagery data that we could use to maybe
00:25:07
Speaker
then identify things just by look. Could you imagine looking out on a fresh landscape that you're just about to survey, and having your glasses, your device that you're using, start identifying previously unknown archaeological features? That is the future. That's a heads-up display, right? Exactly. So the same thing, like the fighter pilot looking through the windscreen and seeing the enemy planes being identified and picked out, even though they're just little dots off in the distance.
00:25:35
Speaker
Yeah, exactly. Except it's actually doing real-time identification. So I think that's probably a number of years down the road, but the foundation of stuff like that is in the paper we're going to finally talk about halfway through this podcast. So Paul, why don't you tee that up for us? Okay.
Deep Learning in Archaeological Detection
00:25:53
Speaker
So this paper, I'm not sure how I found it because it's in a journal called the Journal of Remote Sensing. So it's not an archaeological journal per se.
00:26:01
Speaker
The lead author on it, Mark Altaweel, I follow him on Twitter, so I assumed that I came across this article based off of a post of his online. Yeah.
00:26:12
Speaker
And the article's title, if you didn't get it from the title of this episode, is "Automated Archaeological Feature Detection Using Deep Learning on Optical UAV Imagery: Preliminary Results," which is one hell of a mouthful. But it's interesting. We've talked about other kinds of deep learning, identification of visual information in photographs and such.
00:26:39
Speaker
There's been discussion, not that I think we've ever talked about it, but there certainly has been quite a bit of work over the years on using pattern recognition for identifying sites. And that's basically what they're doing in this. And they're describing their particular process, and not just the process, but also their software, because it's written by at least the lead author, but there are a half dozen people on this paper.
00:27:06
Speaker
from the Middle East, from Germany, and from the UK. I'm not sure how much they all contributed to this, but clearly it's a collaborative project that they're working on. They have their software. Their software is on GitHub. The article
00:27:22
Speaker
especially in the discussion and the conclusions, goes on at length about the need for more access to more data. And so I just thought it touches on a lot of these things. A lot of the article, frankly, again, went over my head. And I think part of that is because there are a few terms that aren't defined in it. But to tie this back to what I said early on about the architecture detection,
00:27:47
Speaker
subsurface at Lagash, I'm looking at various kinds of optical recognition programs to try to get a sense of whether that's going to be something useful
00:27:57
Speaker
at Lagash for finding these buried structures. So I just read an unpublished article that Emily Hammer wrote, and it will be published; it's excellent. It was on Lagash, on work that she did there a couple years ago. And part of the project that she did was tracing a lot of these buildings, these walls. And she did it in a manual process,
00:28:20
Speaker
playing with the contrast and the histograms for the imagery and seeing where she could see walls in certain pictures or where she could see them in multiple and then have greater confidence. And I'm thinking that it would be interesting to try to do the same sort of approach, but in a machine learning environment. I know Jack about machine learning, but
00:28:43
Speaker
But I listen to a number of Python podcasts and GIS podcasts and I read this. I mean, I'm adjacent to these worlds, so I need to start learning this. And I think that this is going to be a good project for me going forward. Anyhow, this article caught my attention because in that title, which mentions a bunch of different words,
00:29:07
Speaker
A bunch of those words are things that are very adjacent to things that I either do or want to learn about. Yeah. Well, this is really cool too, because, like I said when we teed this up here, I personally first started thinking about stuff like this when we worked at China Lake Naval Weapons Center back in 2015. And I was just thinking, man, this is so dangerous, with the snakes and the bombs and just the environment alone,
00:29:35
Speaker
that if only we could have some sort of drone imagery and then be able to actually identify stuff using that drone imagery. In the very limited sense, I was thinking of just visually looking at video imagery that was taken like high resolution video imagery and doing that instead of survey. So you could zoom in and try to see stuff and figure things out.
00:29:56
Speaker
But of course, if you can teach a computer to do that, it's going to be way better than the human eye is ever going to be at some point. In fact, when I was reading this article and they were talking about training these models, and I kept thinking of a movie scenario where the researcher is getting frustrated because it keeps training and trying to tell this thing and the computer keeps coming back with incorrect responses saying, oh, you've got features over here. And I was like, no, we don't because I didn't give you those images.
00:30:22
Speaker
And then all of a sudden the computer's like, you know, the guy realizes, oh man, the computer's been right all along. We've missed this thing forever because we couldn't see it and the computer can. And I was like, that plot just writes itself. But I think that's what's going to happen, right? Like we get these things smart enough to do that pattern recognition on the types of patterns that we're looking for.
00:30:41
Speaker
I mean, we really should be able to not just identify things in a quicker way than we're doing now and cover a lot more ground, but my hope is we'll be able to find stuff we didn't even know existed and maybe didn't even know were human-created features until we had these algorithms to identify
Improving Deep Learning for Archaeology
00:30:59
Speaker
them. You know what I mean? Yeah. And a lot of this article is actually about building
00:31:05
Speaker
the pattern recognition into the system. So it still isn't as good as what people do, but they're trying to make it get there. And so it's interesting from that point, because it's very much a work in progress. Fantastic kudos that they have the code available online. I tried downloading it and running it, but getting Qt, the graphical toolkit, working on my Mac was just a little more than I had the time to deal with right now, because of everything else going on.
00:31:40
Speaker
But I will at some point. The screenshots are all on Ubuntu, which is my flavor of Linux of choice, so I'll grab an Ubuntu box and try it there.
00:31:57
Speaker
And they do, you know, back to your thing about having the augmented reality glasses, we're not there yet, because, and they discuss this in the article, they're talking about doing some of this on whatever computer you have handy, but then also offloading a lot of the learning and the pattern recognition to higher-performance computers and clusters.
00:32:04
Speaker
So we're not there yet. That's not going to be on your glasses for 10 years, that kind of programming power, processing power. But it will be there eventually. And there are also ways, just like what they're talking here about moving workflows between one device and another, there are going to be ways of offloading some of this processing power to stuff that you're not physically carrying around with you. And maybe it's through that Starlink link back to a supercomputing cluster someplace else in the world.
00:32:34
Speaker
But it processes the images, uploads them, the actual intelligence happens somewhere else, and then it sends the results back very quickly so that you can identify what you're looking at. I'm actually going to sidetrack this again, because that seems to be my job lately.
00:32:50
Speaker
When you were talking about identifying things that we don't see with our eyes, I was listening to a webinar yesterday about a historical archaeology project in Rhode Island, and they showed some LiDAR imagery onto which were laid the maps of what they were doing, of where their trenches were and so on.
00:33:10
Speaker
It looked nice. I thought, oh, LiDAR imagery, and I got to thinking about what kind of LiDAR imagery is available for my area. I was working on a project next to the Hudson River back in October, November, December, so on either side of my last trip to Iraq.
00:33:27
Speaker
I decided to go on the New York State GIS department's website, and I found that they do have links for various kinds of imagery, DEMs and such. I downloaded a USGS one-meter DEM, four different tiles of it,
00:33:45
Speaker
pulled them together, merged them so that they're all on the same histogram, and dropped them over the area where we're working. I looked at them in black and white, the grayscale that you normally get from a DEM. I'm like, okay, fine. I changed the color. Then I went and did a contour view in QGIS, because that's just a visualization:
00:34:08
Speaker
You turn it on like you change the color ramp. Then that looked interesting. Just for yucks, I put on the hillshade model view. Normally, I don't like hillshades. I have a fundamental opposition to them because of the way that they look... I mean, it's stupid, but the way that they look normal is with the sun,
00:34:29
Speaker
the lighting source being up to the top, either top left or top right. But in reality, the sun should be in the south. But if you light a hillshade from the south, everything looks inverted. Hills look like valleys and valleys look like hills because of the way our brains process where light is supposed to come from. And so I never use them. But I put a hillshade on and holy crap.
00:34:53
Speaker
I could suddenly see these hundred-year-old roads that we found relics of, clear as day. All sorts of details. I was just kicking myself that we didn't do this before going out into the field and doing our test trenches and the like.
00:35:13
Speaker
we could have targeted so much better in so many cases where we actually dug and did our work, had I seen this kind of imagery before actually going out in the field. Because all the other imagery that we had was either historical aerial photos or satellite imagery like Google Earth and Bing, which you couldn't see through the tree cover. But this DEM was at one meter, and you could absolutely see all sorts of great things.
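For anyone who wants to reproduce that hillshade experiment, the core of it is a few lines of array math. This is a minimal sketch on a synthetic DEM, not the New York State data; the function and the one-metre "roadbed" are purely illustrative. Swapping the azimuth between 315 degrees (the conventional top-left light) and 180 degrees (the physically correct southern sun) shows exactly the inversion Paul describes:

```python
import numpy as np

def hillshade(dem, azimuth_deg=315.0, altitude_deg=45.0, cellsize=1.0):
    """Minimal Lambertian hillshade: 0-255 illumination for a DEM array.

    Azimuth is degrees clockwise from north; 315 is the conventional
    'light from the top-left' that makes relief read correctly, while
    180 (the sun in the south, where it really is) inverts perception.
    """
    az, alt = np.radians(azimuth_deg), np.radians(altitude_deg)
    dy, dx = np.gradient(dem, cellsize)          # dy grows toward the south
    # Unit vector pointing at the sun, in (east, north, up) coordinates.
    light = (np.cos(alt) * np.sin(az), np.cos(alt) * np.cos(az), np.sin(alt))
    # Surface normal is proportional to (-dz/d_east, -dz/d_north, 1);
    # dz/d_north is -dy because the row index increases southward.
    shaded = (-dx * light[0] + dy * light[1] + light[2]) / np.sqrt(dx**2 + dy**2 + 1.0)
    return (255 * np.clip(shaded, 0.0, 1.0)).astype(np.uint8)

# Synthetic DEM: flat ground with a one-metre-deep old roadbed running east-west.
dem = np.zeros((100, 100))
dem[48:52, :] -= 1.0

nw = hillshade(dem, azimuth_deg=315)     # classic top-left lighting
south = hillshade(dem, azimuth_deg=180)  # physically 'correct' southern sun
# The bright and dark walls of the road cut swap between the two renderings,
# which is why south-lit hillshades look inverted to our eyes.
```

A GIS like QGIS does the same thing under the hood when you flip on the hillshade renderer and drag the azimuth slider.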
00:35:41
Speaker
And see, this is what I think the authors of this article are getting to. And as we go to break here, I'll just have one last thought on this and then we'll go to break. But I feel like as archaeologists, I would love to get to the point where
00:35:57
Speaker
Honestly, if we're talking about automation, the drones would probably be automated too, but basically we send out the fleet of drones to take our various imagery, not just visible imagery, but like you were mentioning earlier, doing some infrared, multi-spectral kind of stuff and do all the imagery. And then probably at that point, if we're capable of doing all these things automatically, realistically, probably sending the information back to satellites in real time, but at the very least downloading it when you get back
00:36:22
Speaker
and then running that imagery through all the processing stuff, but basically just hitting a button that says, yeah, do all these things. And having the computer do that, rather than us having to think, oh, let's go in and invert these colors and try this and do that. And like the one researcher you mentioned, that was, you know, messing with the histogram. We do that with rock art all the time, right? What is the name of that program?
00:36:45
Speaker
There's a program you can download. Yeah, it's escaping my mind right now. I know it's on my iPhone. But you use a program to basically change the colors because when you're using old paints and different peckings and stuff like that, sometimes the human eye just can't see it. And you need to change colors and do stuff to be able to see stuff.
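The histogram and color manipulation described here, whether for faint walls in aerial imagery or for pigment in rock art, usually starts with a contrast stretch. Below is a hedged sketch of a simple percentile stretch; the dedicated rock art tools use fancier decorrelation stretches, but the underlying idea of remapping the histogram is the same. The synthetic "wall" and the function name are ours for illustration:

```python
import numpy as np

def percentile_stretch(band, low=2, high=98):
    """Clip a raster band to the given percentiles and rescale to 0-255.

    This is the kind of contrast stretch you'd otherwise apply by hand,
    over and over, when hunting for faint features in imagery.
    """
    lo, hi = np.percentile(band, [low, high])
    clipped = np.clip(band, lo, hi)
    return ((clipped - lo) / (hi - lo) * 255).astype(np.uint8)

# Synthetic band with a faint linear anomaly (a buried wall, say).
rng = np.random.default_rng(0)
band = rng.normal(100, 5, (64, 64))
band[:, 30] += 4  # the 'wall': barely above the noise floor

stretched = percentile_stretch(band)
# After stretching, the wall column stands out from its neighbours.
print(stretched[:, 30].mean() - stretched[:, 29].mean())
```

The stretch doesn't add information; it just spends the full display range on the part of the histogram where the subtle differences live.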
00:37:04
Speaker
But if we had the right patterns in there, which is what these researchers are getting at, and we had enough of those in there that it could learn, then it could run through all these different things on its own and come back and say, listen, in this, I found this. In this, I found this. And in this, I found this. Go figure it out, Mr. Archaeologist or Mrs. Archaeologist. So, you know, or rather, doctor. Sorry. Doctor. Yeah. Don't use the gender language. Jeez. Yeah. I know. I know. Right. So anyway.
00:37:32
Speaker
That sounds like a good point to stop. Let's wrap up this article, because I like where it's going. It's getting into some pretty cool spots, and we'll talk about that on the other side of the break. Welcome back to episode 174 of the Archaeotech Podcast. And we are wrapping up this discussion that got started a little late, but basically,
00:37:49
Speaker
Archaeological Feature Detection Using Deep Learning on Optical UAV Imagery: Preliminary Results is the name of the article. Check it out in the show notes over at archpodnet.com/archaeotech/174 if you want to see the article. And the article links to all the supplemental material, like Paul mentioned, and just a lot of good stuff. So go check that out. But yeah, when we were headed to the break,
00:38:13
Speaker
We were talking about, I guess, the possibilities around this. So you said, you used the term hitting a button just before we went to break. And that got me thinking, that's actually really apt in terms of this article because the authors are keen on accessibility. They want the imagery to be accessible, to be publicly accessible. They want the code, the source code of their project to be publicly accessible. They wrote about it.
00:38:42
Speaker
Even though it's in preliminary format, they wrote about it in an open access journal. Accessibility is a really big thing. One part of the accessibility that runs right through it is that most of the time when we've looked at computer vision kinds of projects, it's very code heavy.
00:38:58
Speaker
I've played with some of these before and you get in the weeds pretty quickly working with whatever packages they've got. Oftentimes it's in Python, so I have some access to it because that's kind of the way my brain works. But still, it's very code-based and very code-heavy.
00:39:16
Speaker
And depending on the programmer, what your different inputs might be, what your variables might be that you have to set, what the settings are, whatever you want to call them, might be very obscurely named. And that's not the route that they went down here. They actually built a GUI. Now, the GUI is exactly what stopped me from working with it on my Mac, but again, I'm not afraid of it. With a little bit more time, I'll either get it working on my Mac or I'll get it working.
00:39:43
Speaker
you know, one of the zillions of Linux boxes I've got around; I just don't have one in front of me today. And they explain what a few of the settings are, and say, you know, you could do a setting of 100 or you could do a setting of 50, depending on these various parameters. That's explained in the article. But basically, it's just
00:40:03
Speaker
picking a number in a dialog box on a GUI. And then the biggest part of the training that happens is human interaction: tracing the outline of a feature and saying, hey, this is that kind of feature. And here's another image, and here's that same kind of feature, and here's a new outline. And then letting the computer do the work of trying to interpret what's the same about these various photos,
00:40:30
Speaker
what's the same about the features that have been highlighted with them. And so again, back to the notion of accessibility, it's trying to be accessible by not hiding because the source code is there, but not forefronting the code.
00:40:45
Speaker
Making it so that with minimal training, you or I could go and take this and train a set of images and see what kind of data we get back. Again, it's still in process, so there's, I think, a lot of refinement that has to be done on the actual process of the machine learning to identify these properly, which is something else they talk about, the hits and the misses.
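The training workflow Paul describes, tracing an outline and telling the computer "this is that kind of feature," boils down to turning human tracings into per-pixel labels. A real tool would rasterize polygons with GDAL or rasterio; this toy even-odd ray-casting version (our own names, not the authors' code) shows the idea:

```python
import numpy as np

def polygon_mask(outline, shape):
    """Rasterize a traced feature outline into a binary training mask.

    outline: list of (row, col) vertices, as a user might click them in a GUI.
    Uses even-odd ray casting; production tools use GDAL/rasterio, but the
    idea is the same: human tracings become per-pixel labels for training.
    """
    mask = np.zeros(shape, dtype=bool)
    n = len(outline)
    for r in range(shape[0]):
        for c in range(shape[1]):
            inside = False
            for i in range(n):
                r1, c1 = outline[i]
                r2, c2 = outline[(i + 1) % n]
                if (r1 > r) != (r2 > r):
                    # Column where this edge crosses the current row.
                    cross = c1 + (r - r1) * (c2 - c1) / (r2 - r1)
                    if c < cross:
                        inside = not inside
            mask[r, c] = inside
    return mask

# A traced rectangular structure on a 20x20 image tile.
outline = [(5, 5), (5, 15), (15, 15), (15, 5)]
mask = polygon_mask(outline, (20, 20))
print(mask.sum())  # number of labelled pixels
```

Each traced image then contributes an (image, mask) pair, and the network's job is to learn what the masked pixels have in common across many such pairs.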
00:41:10
Speaker
But that's a positive way to go. I mean, you and I are both Apple users, right? And that's how Macs first became very popular: as the simple way to do things. You didn't have every setting under the sun, which really upset certain people. But most of the time, the settings that are there cover the way most people are going to use it. Yeah, which is the whole point, right? Yeah. Yeah.
00:41:37
Speaker
Yeah. That's where I would love to see... I hope they really get this done and they can get enough data in there to actually make this thing happen, which was kind of the other huge point of this article: there's not enough imagery to put into these models for them to learn. I mean, they mentioned even at one point, I think, hundreds of thousands of images of different things in order to get the neural network, the convolutional neural network, to actually learn
00:42:03
Speaker
what these are. And we all know that's true because you could have a hundred pictures of the same thing and it's all going to look slightly different, right? So we need different angles, we need different conditions, different all kinds of things for that to work out. So that was one of the things I liked that they mentioned is that they want researchers to share their data and for academic researchers, even incentivizing them with, I guess, credit, academic credit,
00:42:29
Speaker
for actually sharing the research, not just completing the research, but you get a little extra if you share it in open repositories. Yeah. And then for reference, they looked at three different kinds of sites. So they looked at structures, they looked at what they called mound sites, which they didn't describe in great detail, but I'm assuming they mean tell sites.
00:42:51
Speaker
which is a particular site formation process that's very common in the Middle East. That's the other reason why this interested me because their examples are all in Middle Eastern archaeology, Arabian archaeology, so stuff I'm familiar with from my own training and stuff that interests me.
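One standard way to stretch a dataset of only 100 to 200 photos, as mentioned a moment ago, is augmentation: generating new training views from each annotated image. The sketch below only does the eight flip-and-rotate symmetries; real pipelines add brightness shifts, noise, and crops, and we don't know what, if anything, these particular authors used:

```python
import numpy as np

def augment(tile):
    """Yield the eight symmetries (4 rotations x optional flip) of an image tile.

    A cheap stand-in for 'different angles, different conditions': each
    annotated photo becomes eight training samples. Real pipelines also
    vary brightness, add noise, and take random crops.
    """
    for k in range(4):
        rotated = np.rot90(tile, k)
        yield rotated
        yield np.fliplr(rotated)

tile = np.arange(16).reshape(4, 4)  # stand-in for a small image tile
variants = list(augment(tile))
print(len(variants))  # variants generated from one annotated tile
```

Augmentation helps with viewpoint and orientation, but it can't conjure the genuine variability between sites, which is why the authors' call for shared open imagery still matters.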
00:43:08
Speaker
And then the other kinds of sites: they did structures, as I've mentioned, and qanats. Qanats are a kind of aqueduct, basically, that's common on either side of the Persian Gulf. And I've also seen them in Yemen, in southwestern Arabia. So you can kind of imagine a sort of east-west band across the southern tip of Arabia over into Iran.
00:43:37
Speaker
It's an underground aqueduct that every so often you have an access hole that goes up to the surface. It'll be a very gradual slope to the channel that's underneath the ground and then periodically every 50 feet, 100 feet, whatever it is, a hole that goes back up to the surface for
00:43:59
Speaker
digging the channel and for keeping it clean. Interestingly enough, of the three different kinds of sites they picked, that's the one I thought was going to be the easiest to identify, because it's circles, it's holes, so you get a very dark center, and they're linear. They're linear and they can stretch for miles.
00:44:19
Speaker
That's the one where they had lots of mis-hits: things being misidentified as the wrong kind of site, or not being identified properly. That does suggest that there's a lot of room for improvement on the actual algorithms of that neural network, how they're building that out. Way above my pay grade to figure out what's going on.
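For intuition about why dark circular shafts seem "easy," here is a deliberately naive non-neural baseline: sum-of-squared-differences template matching on a synthetic scene. On clean synthetic data it finds every shaft exactly; real imagery, with varying ground tones, shadows, and look-alike pits, is where this breaks down and where a trained network, for all its current mis-hits, has room to do better. Everything here is illustrative:

```python
import numpy as np

def match_template(image, template):
    """Sum-of-squared-differences template matching (valid positions only).

    Low scores mean the image patch closely resembles the template."""
    th, tw = template.shape
    ih, iw = image.shape
    scores = np.empty((ih - th + 1, iw - tw + 1))
    for r in range(scores.shape[0]):
        for c in range(scores.shape[1]):
            patch = image[r:r + th, c:c + tw]
            scores[r, c] = ((patch - template) ** 2).sum()
    return scores

# Synthetic scene: bright desert floor with three dark shaft holes in a line.
image = np.full((40, 120), 200.0)
yy, xx = np.ogrid[:40, :120]
for cc in (20, 60, 100):
    image[(yy - 20) ** 2 + (xx - cc) ** 2 <= 9] = 60.0  # dark disc, radius 3

# Template: a single dark disc on bright ground.
template = np.full((9, 9), 200.0)
ty, tx = np.ogrid[:9, :9]
template[(ty - 4) ** 2 + (tx - 4) ** 2 <= 9] = 60.0

scores = match_template(image, template)
hits = np.argwhere(scores < 1e-6)  # top-left corners of perfect matches
print(hits)
```

The fragility is the point: a fixed template has no notion of scale, lighting, or partial burial, which is exactly the variability a learned model is supposed to absorb.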
00:44:43
Speaker
I know some of the terms, and if you ask me to actually describe what they mean, I'll sound like that kid that didn't read his homework and is trying to write the book report, which is basically what I am right now with this.
00:45:01
Speaker
but it was interesting to see these different things. And so they had roughly 100 to 200 photos that they ended up using of each of these different kinds of sites and structures. That in and of itself, that they were able to get a little bit of success with that limited number of photos, is pretty encouraging, to be honest.
00:45:23
Speaker
Yeah, and then that brought up some questions about the difference. If you had more photos, would you want more photos of the same few things, so that you can really dial in what this particular kind of structure looks like, this particular kind of qanat? Or do you want a whole bunch of different things showing the variability? Which is the better strategy? I don't know. They don't go into that, but it is something that's always in the back of my mind.
00:45:48
Speaker
Yeah. Well, in the last few minutes of this podcast, I want to talk about, because they briefly talked about, where future directions should go here. But I want to talk about the far future and what this could mean. And what I mean by that is, in the last week or so, I was finishing up a book, and I might have to look at my Audible account to see what it was. I don't remember the name of it, but it was about
00:46:11
Speaker
artificial intelligence and what designing artificial intelligence even means, right? Like, what kind of rules do we apply? What rules could you give an artificial intelligence around service to humans and things like that, and where could that go wrong? And, you know, not in a weird sci-fi way, but in a realistic way: where can this actually just go wrong? And it makes me think, not really down that road, but this is leading to some sort of an AI
00:46:39
Speaker
or a computer program that says, I'm looking at all this data and here's all the things that I think are features, right? But once we teach it that and it gets really good at that and people start using it more and we just keep feeding it and feeding it and feeding it and feeding it, like anything you feed too much, it'll just get way too big. And then now,
00:46:58
Speaker
it's got all this information. So what's to stop people from starting to input actual interpretation, you know? Say, well, every time we see this, it means this. And every time we see this along with this, it means this. That's not too far-fetched. Which means, you know, at some point, what's the archaeologist for?
Future of AI in Archaeology
00:47:20
Speaker
I mean, we send out the drone data. The development company here in the United States says we're putting up a super mega Walmart over the entire state of Idaho, so we need to go out there and do some survey because that's still a law, luckily, in 2300.
00:47:37
Speaker
And it sends out the auto drones or the super high resolution satellites that they can access right away, and then brings back all that imagery to the computer system. It says, we found this, this, and this. It sends out things to excavate or mitigate, and then you're done. The real jobs that CRM archaeologists have is really just to help construction companies know what to avoid. And if they can't avoid it, then we have to excavate it. That's what an archaeologist really does.
00:48:02
Speaker
When it comes down to interpretation and stuff like that, it's not even really technically part of our jobs. I mean, it kind of is to an extent, but not really. And I'm just wondering if the entire job of an archaeologist could be taken up by this in the future. I'm going to bring our question. I'm not going to even try to answer that.
00:48:19
Speaker
I'm going to bring another question that's in the closer future because another thing that struck me reading this, it's UAVs. We normally think of drones when we think of UAVs and we normally think of quadcopter drones in particular. I was like, well, most of the work I've seen with quadcopter drones is
00:48:41
Speaker
site level, right? It's not to identify where the sites are; it's to look at features on a site. But this article is clearly geared at larger swaths of landscape. And I was wondering how this works, you know? So do we need, like, old spy photos, the Coronas? Would it work with satellite imagery? Is the resolution, the detail, good enough now? I don't know. But most of the drones aren't going to be flying high enough to get a good landscape view.
00:49:10
Speaker
And then when I clicked on Mark's Twitter account just to see if I could find that article, I didn't find it, but I'm sure that's where I got it from. I saw a photograph that he'd posted on his account of a fixed-wing drone. And I thought, oh, of course, duh.
00:49:28
Speaker
That's what he's meaning by the UAVs. This thing was terrifying. It was in the office there and it had at least a 10-foot wingspan. Supposedly can be up in the air for four hours. When he writes about UAVs, it's on a slightly different scale than what I've been using so far.
00:49:53
Speaker
Well, landscape imagery should always be done with fixed wing at this point, right? Because the quadcopter drones... I would think. Yeah, the quadcopter drones are just too heavy, really, because they don't have any real flight characteristics. They're purely being held aloft by the force of the motors. Exactly, right? It's just a brick. But the really nice UAVs... in fact, Trimble, I think, makes one. Trimble makes a fixed-wing UAV that has sub-meter RTK
00:50:23
Speaker
accuracy within the device. And it's got something like a six- or eight-foot wingspan. I saw it at an event I was at a few years ago. But fixed-wing drones are the way to go. Now, they are subject to a lot of the same limitations, of course, that physics puts on aerodynamic things.
00:50:38
Speaker
They're really light, usually, and they have electric motors, not gas motors, and that lightness obviously makes them really subject to wind. And you know, with fixed wing, you do tend to go a little higher, a little farther away, and for a lot longer period of time, so you really have to be cognizant of what the wind is doing to you. Yeah. Also, you go faster, so you capture more in the photograph, but you have to go higher so that at that speed you don't get motion blur.
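The altitude, speed, and motion-blur trade-off has a simple back-of-the-envelope form: ground sample distance (metres per pixel) grows with altitude, and blur in pixels is ground speed times exposure time divided by the GSD. The camera numbers below are illustrative defaults, roughly a one-inch-sensor mapping drone, not the specs of any particular aircraft:

```python
def gsd_m(altitude_m, sensor_w_mm=13.2, focal_mm=8.8, image_w_px=5472):
    """Ground sample distance (metres per pixel) for a nadir-pointing camera.

    Defaults are roughly a one-inch-sensor mapping drone; treat them as
    illustrative, not a spec for any real platform.
    """
    return altitude_m * sensor_w_mm / (focal_mm * image_w_px)

def motion_blur_px(speed_m_s, exposure_s, gsd):
    """How many pixels the ground moves during a single exposure."""
    return speed_m_s * exposure_s / gsd

g = gsd_m(120.0)                           # flying at 120 m above ground
blur = motion_blur_px(16.0, 1 / 1000, g)   # 16 m/s cruise, 1/1000 s shutter
# At these numbers the blur stays under a pixel; halve the shutter speed
# or the altitude and you start smearing detail.
print(round(g, 4), round(blur, 2))
```

The same arithmetic explains the fixed-wing dilemma: faster flight covers more ground per battery, but pushes you toward higher altitudes or shorter exposures to keep the blur below one pixel.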
00:51:08
Speaker
Exactly. I think all those problems are going to be solved just by increasing technology in other fields. We get batteries that are lighter and last longer. We get materials that are better, so we can probably make the thing a little bit heavier because we've got better batteries and motors.
00:51:23
Speaker
Then we get better cameras, so you could fly a little bit lower, but maybe the camera's so good that you can fly higher and it's got just an amazing resolution that the computer can see. Then you're talking about satellites. Is the resolution good enough? Probably not with the ones we have access to, but we know damn well that there are government satellites up there that can count your nose hairs. As soon as we get access to stuff like that, because they've got even better stuff,
00:51:49
Speaker
then that's going to change everything. Actually, access to satellite imagery... that's one of the things they mentioned in the article: one of the downsides of satellite imagery is that it's not representative of now. It could be representative of a year ago or two years ago. Yeah, right. So if we have access to real-time satellite imagery, or at least near real-time, like within the last few months, we can
00:52:12
Speaker
book time on it, and it sends you back an email that says, great, the satellite passed over and took all your imagery on this date, and it's within the last three or four months. That would be fantastic. And then we would probably never have to use drones again for landscape imagery like that.
00:52:29
Speaker
I don't know. It's an exciting time. Well, I'm going to throw in another wrinkle that wasn't mentioned in the article, though I did really appreciate that point about the concurrency of the imagery. With the UAV stuff most archaeologists are using, it's photographs that were taken today, yesterday, during the project, as opposed to whatever other imagery you have, which might be Corona stuff from the 60s or Landsat stuff from the 70s or 80s, but it's definitely not current.
00:52:57
Speaker
Yeah. One big advantage of the UAVs, of the drones, that was not mentioned in the article is that you're generally flying under the weather. Yeah, that's a good point. So satellites, that kind of real-time satellite stuff, and that's used a lot for forestry, for agriculture, for a lot of that sort of thing, is very subject to cloud cover.
00:53:19
Speaker
And so you might have a stretch of time that you don't get good imagery of the ground because of cloud cover. You don't have that problem with the drones, generally speaking. So, you know, there are pros and cons to everything.
00:53:33
Speaker
And that's probably why, as we've said on virtually every single episode, it's all about a suite of tools at your disposal. So use the one that might be the most reliable. Try to use that the most, and then use other things to fill in where that's less reliable. I would imagine high resolution satellite data would be the most reliable thing if we had access to that kind of satellite data. But then also, like you said, cloud cover, different conditions, whatever the case may be.
00:54:02
Speaker
then you can fill that in with other stuff. Or maybe right now drone is the most reliable thing, and we can fill that in with satellite imagery in certain circumstances where, for one reason or another, we just couldn't get good drone imagery. For example, places we want to study that are seeing hostile action right now, or that you just can't get into for various political reasons, but where we have good satellite data. I mean, you get into a bunch of ethical concerns there that have to be navigated, but
00:54:30
Speaker
It does open up different avenues for research. All right. Well, I think that's about all we have time for. Again, if you ever have any comments, like our friend James, please send us an email: chris@archaeologypodcastnetwork.com or paul@lugol.com. Both of those are in the show notes for every episode, and so is other contact information, like our Twitter handles, and
00:54:56
Speaker
I mean, leave us a comment or something on iTunes, like a review, but don't try to ask us a question there, because nobody reads those. I look at them like once every six months, and it's not a good place to ask questions. I've seen questions in other podcasts' reviews, like, hey, on this episode, this, this, this, and this. Those are for reviews, not questions. Otherwise, anything more to add, Paul?
00:55:18
Speaker
Just the one thing is that I'm glad that this exists because I'm going to explore it a little further, not on the site identification, but like I said at the start, I think this might be a good intro for me on doing intra-site inspection with my aerial photographs. I'm glad it exists.
00:55:37
Speaker
With that, I think we'll be out, and Paul, you are headed off to Iraq, so this is the last episode with you for a little while. We do have an interview coming up, talking about AI and robotics. That's going to be pretty cool; we'll release that at the end of March. But I'm sure we'll have some other episodes regardless of Paul's absence, and then we'll get to talk about Iraq when he gets back. So yeah, hopefully I've got some good new data to talk about.
00:56:04
Speaker
Yeah, absolutely. All right, well with that, we'll see you guys in a couple weeks. All right, take care.
00:56:14
Speaker
Thanks for listening to the Archaeotech Podcast. Links to items mentioned on the show are in the show notes at www.archpodnet.com/archaeotech. Contact us at chris@archaeologypodcastnetwork.com and paul@lugol.com. Support the show by becoming a member at archpodnet.com/members. The music is a song called Off Road and is licensed free from Apple. Thanks for listening.
00:56:39
Speaker
This episode was produced by Chris Webster from his RV traveling the United States, Tristan Boyle in Scotland, DigTech LLC, Cultural Media, and the Archaeology Podcast Network, and was edited by Chris Webster. This has been a presentation of the Archaeology Podcast Network. Visit us on the web for show notes and other podcasts at www.archpodnet.com. Contact us at chris@archaeologypodcastnetwork.com.