Hello, everyone, and welcome to the final Thunder Talks. The Thunder Talk battle dome commences now.

No?

Okay, well, I guess I could. Oh, okay. I'll just give a quick introduction, Kevin, sorry. We're going to have our final series of Thunder Talks: the first set focusing on geology and imaging, and the second set following that will be more on ground motions and that sort of thing. I know I said I was going to take a break after Claire's talk, but we'll actually have the break after Sean's talk, so that Brad can make his cutoff. So, without further ado: Kevin.

Thank you. Well, as everyone who's listening knows, the San Andreas plate boundary is complicated: there are multiple active faults, it has evolved over time, and the question is, how did it form that way? Why are the main faults where they are? The key point I want to make today is that the lithospheric configuration we have was, I would argue, produced in the very early days of the formation of the Mendocino Triple Junction and the initial formation of the San Andreas plate boundary.

So the first question we have to ask is: where actually is that large-scale, lithospheric-scale plate boundary? Luckily we can use the very detailed GPS data, because there have been no big earthquakes to disturb it, so it is mostly recording the deeper shear acting beneath the locked seismogenic zone of the plate boundary. What we see is that the plate boundary, either in map view on the right or in cross sections across it, is essentially centered on a corridor that runs from the Maacama fault in the north through the Hayward and then into the main San Andreas in the south. So the question is: why is it occurring, why is it forming, even at its youngest point, far inland from the western edge of North America?

To answer that, we can go back to the early days of the formation of the San Andreas plate boundary, when the Mendocino and Pioneer fracture zones first intersected North America. Based on the reconstructions, there was a short period of time when we had a Pioneer triple junction, prior to the Mendocino Triple Junction (the MTJ) forming, and in that process a small fragment of the Pioneer plate, called the Pioneer fragment here, was accreted to the Pacific and has traveled with the Pacific ever since. The question, though, is: does it still exist?
That was 30 million years ago, and if we look at the tomography we now have for the region, we can see a very nice image of this relic Pioneer fragment as an eastern extension of the Pacific plate beneath the western margin of North America. When we zoom in, we see that its eastern edge, which basically forms the shear zone at depth with the slab window that develops in the wake of the triple junction migration, sits right below where the new fault system, the new plate boundary, forms, and that's the expected place for that shear zone to form.

So if we now put this into context (let me go back one, or two): the main faults, the Maacama, Hayward, and so on, are actually a new system that forms above this shear zone. But we do have inherited faults, like the Lake Mountain and Bartlett Springs, that are probably associated with the previous subduction but then decay over time as this new system develops. So, to pull this all together: why does the plate boundary form where it does? I think it forms inboard because of the initial plate tectonics, and that's because there is this little section of accreted, old subducted slab that moves along with the Mendocino Triple Junction.

Alright, I'm excited to be here; thanks for having me. How might we use fluvial terraces to study tectonics at the triple junction, at the southern CSZ? This is the MTJ, where there is an overlap between the San Andreas and Cascadia faults. Along the Eel River (oh, I didn't advance; there we go), along the Eel River is a terrane-bounding thrust, the Russ fault, that red line. The Russ fault is a USGS National Seismic Hazard Model source, and it is a north-vergent reverse fault at the boundary of the Neogene section to the north; we can see that in this cross section from Bob McLaughlin, the Russ fault there. There is also a south-vergent reverse fault: here is a fault offsetting latest Pleistocene to Holocene Eel River terraces at Shively. You can see that, and here is a photo of that fault scarp. The buildings in this photo are all on the same terrace tread, so to derive the slip rate we need ages for the terrace.

Here are the terraces that I have mapped. There are numerical ages for the terraces further to the north, in the Rohnerville area, so by mapping and correlating these terraces we can apply those ages to the terraces at Shively and calculate a slip rate. So we map these terrace treads, and we are going to take a look at the terraces in the Rohnerville and Shively areas. The T numbers, with T1 being the youngest terrace, are applied in each region independently, and to compare tread heights I constructed a relative elevation model, so all terrace elevations are extracted from this REM.
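A minimal sketch of this kind of workflow, extracting mean relative elevations per mapped tread with zonal statistics and turning a dated, offset tread into a vertical separation rate, might look like the following Python; the file names, ages, and offsets are placeholders, not values from this study.

    # Sketch: mean relative elevation per mapped terrace tread, then a simple
    # vertical separation rate. File names and numbers are placeholders.
    import numpy as np
    from rasterstats import zonal_stats  # zonal statistics over polygons

    # Mean and standard deviation of the relative elevation model (REM)
    # within each mapped tread polygon
    stats = zonal_stats("terrace_treads.shp", "eel_river_rem.tif",
                        stats=["mean", "std"])
    tread_elev = np.array([s["mean"] for s in stats])  # meters above local river level

    # Slip-rate arithmetic: offset of a dated tread across the fault scarp
    offset_m = 1.0        # scarp relief in meters (placeholder)
    age_ka = 11.0         # correlated tread age in ka (placeholder)
    age_sigma_ka = 2.0    # age uncertainty in ka (placeholder)

    rate = offset_m / age_ka                      # mm/yr, since m/ka equals mm/yr
    rate_lo = offset_m / (age_ka + age_sigma_ka)
    rate_hi = offset_m / (age_ka - age_sigma_ka)
    print(f"vertical separation rate ~ {rate:.2f} mm/yr "
          f"(range {rate_lo:.2f} to {rate_hi:.2f} mm/yr)")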
I then applied zonal statistics to plot the mean relative elevation for each tread, and the color represents the regional terrace number. The Rohnerville treads are deformed: this plot runs north to south, and the vertical axis is relative elevation. If we zoom into the Rohnerville area, we can see that these terraces are deformed within a syncline; it's possible that the tread formation is syntectonic. If we look further to the south, between Rohnerville and Shively, the terraces don't appear to be folded, yet they have an increasing relative elevation in the downstream direction, so there is possible evidence for tectonics. I'm going to skip ahead and end my talk. Thank you very much.

That's it. Or a whole bunch more slides came up, sorry.

Alright, hi, everybody! My name is Claire Devola. I'm a second-year Ph.D. student at UC Santa Barbara, and my research focuses on the question: does Tomales Bay stratigraphy record ancient earthquakes on the San Andreas fault? If the slide advances... there we go.

In 1906, San Francisco experienced a magnitude 7.9 earthquake, which resulted in about 500 million dollars' worth of damage at the time, and it's thought that there is a pattern to where these ruptures occur. Two studies of the sedimentary record in Bolinas Lagoon and Bodega Harbor found 1906-size earthquakes about 400 and 750 years ago, indicated by subsidence events in the sedimentary record, which led them to infer a recurrence of these ruptures at roughly 300-to-400-year intervals. However, the sedimentation rates in these two bays are lower than the erosion rates, which means more sediment is being taken out than deposited, and that becomes a problem when you want to preserve something in the sedimentary record. This leads me to my study area of Tomales Bay, which is a submerged sag basin along the San Andreas fault. I'm looking at the head of the bay, which is inland from the open ocean, helping to prevent major erosion, and which has an average sedimentation rate of about 5 millimeters a year. This gives Tomales Bay the potential, and the sediment, to record a better archive of past seismic activity. Panels B and C of the figure on the right display the locations of some sediment cores we've collected in three marshes of the bay. And here are some results.
Within multiple sediment core transects in the bay we see a very sharp facies change, from a gray, shell-filled mud to a very organic-rich brown mud, which we interpret as a bay-type environment overlying a marsh-type environment, showing that the elevation of this area changed very rapidly, or subsided, to turn those marsh deposits into bay deposits. Here you can see some photos of what these deposits actually look like. In the third panel from the left, in core 13, you see a very sharp change in the core from that gray bay mud into a more organic-rich mud, and while the other cores aren't quite as nice, that change is present across the transect and occurs at around the same depth.

The next steps for this project are to use modern foraminiferal assemblages to correlate these changes seen in the cores with past environments, to determine if they really are subsidence events. We already have that elevational surface transect collected and counted, so we're moving right along with that, and hopefully we'll get some more information about the frequency and extent of past ruptures along this section of the northern San Andreas fault, and the ability of Tomales Bay to preserve them. So thank you all for listening to our little update from UC Santa Barbara.

Hi, everybody, this is Don Harp; I'm splitting this presentation with Rob Givler of LCI. I'm going to talk about some of the California State Water Project's San Joaquin River Delta West Tracy fault investigation and its impact on the Clifton Court Forebay dam in the Delta. The California State Water Project begins its journey in Northern California, in the upper Feather River basin, finds its way through Lake Oroville, down the Sacramento River, and into the Bay-Delta area. Clifton Court Forebay is denoted by the red star you see there, and water then begins its journey south along the California Aqueduct to its end in Riverside, California, and points in between.

Why the West Tracy fault investigation? Clifton Court Forebay is a critical point in this water project: it is the collection point in the Delta for the entire State Water Project and California Aqueduct. In 2018, the Department of Water Resources Director's Dam Safety Review Board for Clifton Court Forebay recommended that the postulated fault scarp of the West Tracy fault, trending beneath the dam in two locations, be investigated.

So you have an idea of exactly where this is: off to the right of the screen is the city of Stockton, Clifton Court Forebay is noted by the red star, and you can see a light red line there denoting the approximate location of the West Tracy fault. To the left you see an enlarged area with Clifton Court Forebay and the West Tracy fault; the small yellow polygon in the upper left, noted as the Perry property, is where our investigation is located. This is what the Perry property looks like; the view is to the north-northwest.
This is a view along the scarp; I'll show you a slide of that in lidar in a minute. What you're seeing is about a meter of relief from left to right, really marked by the vegetation contrast right there.

What was done: we began, as anybody would begin an investigation, with air photo, lidar, and data review. Getting onto the property required entry permits and environmental clearance. Then we did site mapping on the Miss Mary Perry property north of Clifton Court Forebay, ran a 1,000-foot-long cone penetration test transect across the scarp, acquired seismic reflection data that Rob will talk about in a second, and then did report preparation and conclusions. This figure shows the lidar of the scarp at Ms. Perry's property, some of the premier land in the western Delta area. The white line at the top is the first CPT line, done in 2019; the black line is the seismic reflection line, followed by additional CPT work. This is what we first thought when we ran the first CPTs, the red diagonal line: we thought the fault came to the surface, based on the offset of the surface and the refusal depths on the cone penetration tests. And Rob is going to tell you about the remainder of the investigation.

Thank you, Don. Good afternoon, everyone. As Don introduced, I'm going to talk about the seismic reflection profile that we collected in 2021 across the West Tracy fault. This is a map of our study area just south of the town of Byron, close to the intersection between the West Tracy fault, in the middle portion of the line, and the Midland fault on the left. Our seismic profile is shown in the middle part of the figure; it was 1,250 meters long and northeast-trending across the West Tracy fault, crossing the topographic lineament that Don just discussed. To the southeast of the line, the bedrock geology includes Tertiary units, including the Neroly Formation, dipping off to the northeast on the northeastern flank of the Diablo Range. We also bring bedrock control into the northeastern portion of the line using the Sprawl 1 oil and gas well just to the southeast of the seismic line, and a series of CPTs along the existing seismic line. This is an uninterpreted and an interpreted prestack depth-migrated profile across the West Tracy fault. Here you can clearly see the northeast-dipping monoclinal fold that is produced by the fault, and you can see a very well imaged fold axis, the monoclinal fold axis, projecting straight up to the topographic lineament that Don just illustrated. We have imaging of that limb down to 300 feet depth below the profile, and we interpret the West Tracy fault as tipping out at roughly 2,000 to 2,200 feet below sea level.
From the Sprawl 1 line we can bring in our Tertiary units, including the Domengine, Neroly, and Tulare formations, and we interpret stratigraphy as young as the Corcoran Clay within the profile, based on regional correlation of depth. Next in this profile is a geologic cross section that was developed by Jeff Unruh, which helps us understand the age of the initiation of folding: if we flatten on the base of the Tulare Formation, we estimate the age of fold initiation at approximately late Neroly to early Tulare time. If we look at the shallow portion of the seismic line, we see a prominent late Pleistocene unconformity, marked in blue, that appears to be faulted by a series of northeast-dipping reverse faults, which we interpret as bedding-parallel faults similar to the O'Neill fault system further to the south along the Great Valley fault system. If we interpret the uplift of that Pleistocene unconformity, assuming it's Riverbank or Modesto age, we estimate a vertical growth rate of between about 0.08 and 0.12 millimeters per year. I'll end with a little cartoon that Jeff Unruh put together for the talk that illustrates how the fold scarp would migrate through time as the fold grows. Thank you.

Hello, everyone, I'm Jonathan Delph. I'm an assistant professor at Purdue University, and today I want to talk about a recently funded NSF project that allowed us to deploy nodal seismometers throughout the southernmost Cascadia forearc, or, for this audience, I guess, northernmost California. This was based on some research we've been doing over about the past five years comparing seismic velocity structure along the Cascadia forearc with other things that are heterogeneous along the margin, like non-volcanic tremor. For reference, here's the Mendocino Triple Junction and here's the California-Oregon border to get you oriented. You can see these really slow velocity zones that spatially correlate well with the distribution of non-volcanic tremor, and we interpreted these as basically accreted sedimentary rocks that were infiltrated by fluids released from the slab, causing non-volcanic tremor. We followed this up in 2021, trying to link some other geological and geophysical characteristics that vary along the margin with our seismic velocity structure; I won't go into this, but it brought up a couple of questions we wanted to investigate further. One was better understanding whether northern and southern Cascadia are really analogous. In the north we have good constraints on lower crustal architecture thanks to the Puget Sound, which allows cruises that gather really nice reflection images, but we can't do this over the Klamath terrane at nearly the same resolution.
The other thing, which might be a little more interesting, is that the preexisting models of where the plate interface is in northernmost California seem to disagree significantly with seismic images and low-frequency earthquake locations, which would place it about 10 kilometers shallower below the forearc. So this is something we wanted to investigate further as well. In 2020 we deployed 60 nodal seismometers throughout northernmost California, mainly encompassing the Klamath terrane and overlapping our region of tremor, shown by the purple bounds; you can see it's largely confined to the forearc domain. The main goals are to better characterize the architecture of the overriding plate, better locate the plate interface, and be able to disentangle some of the deformation induced by the northward migration of the San Andreas system from subduction-related deformation. Thanks.

Alright, hi, everyone. I'm going to attempt to go through this with Peggy-like speed. Today I'll be talking about CANVAS, which is an adjoint waveform tomography model of California and Nevada that we've developed. I'll describe the model very briefly and then talk about an application of it that I think might be of most interest to everyone here. So here is CANVAS; again, it's an adjoint waveform tomography model of the entirety of California, and it includes Nevada as well. We used a data set of 112 events to calculate our model. It's currently resolved down to a 15-second minimum period, and we're hoping to go a little below that as well; the depth slice I'm showing here is at 5 kilometers.

One of the interesting things we've been able to do with our model at 20 seconds is to invert the moment tensors for the sources used in the original study, down to 20 seconds, as well as for some events that weren't originally included in our inversions. I'll show an example here from the May 2013 event that occurred off of Santa Barbara, around magnitude 4.8. We use MTtime, a time-domain moment tensor code from Andrea Chiang at Lawrence Livermore, to invert our moment tensors. Here I'm showing the GCMT solution as the red beach ball, the moment tensor from our study as the black beach ball, and the ANSS moment tensor as the blue beach ball; the red inverted triangles are the stations we used in our inversions. We pick those stations by binning all available stations by distance and azimuth from the source and picking the station with the highest signal-to-noise ratio in each bin. Here are the waveforms for that moment tensor inversion: our solution is about 3 kilometers shallower than the GCMT solution, and we get an overall variance reduction of 85%.
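A minimal sketch of that bin-and-select station strategy, plus the variance-reduction metric quoted here, might look like the following; the bin widths and example numbers are placeholders rather than the study's actual settings.

    import numpy as np

    def select_stations(dist_km, az_deg, snr, d_bin=50.0, az_bin=30.0):
        """Keep the highest-SNR station in each distance/azimuth bin."""
        best = {}
        for i, (d, a, s) in enumerate(zip(dist_km, az_deg, snr)):
            key = (int(d // d_bin), int((a % 360.0) // az_bin))
            if key not in best or s > snr[best[key]]:
                best[key] = i
        return sorted(best.values())

    def variance_reduction(data, synth):
        """VR = 1 - sum((d - s)^2) / sum(d^2), often quoted as a percentage."""
        data, synth = np.asarray(data), np.asarray(synth)
        return 100.0 * (1.0 - np.sum((data - synth) ** 2) / np.sum(data ** 2))

    # Example with made-up station geometry and signal-to-noise ratios
    rng = np.random.default_rng(0)
    dist = rng.uniform(10, 800, 200)     # epicentral distances, km
    az = rng.uniform(0, 360, 200)        # azimuths, degrees
    snr = rng.lognormal(1.0, 0.5, 200)   # signal-to-noise ratios
    picked = select_stations(dist, az, snr)
    print(len(picked), "stations selected")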
One of the things I want to point out is this waveform from BK.JCC, which is nearly 800 kilometers from the source; we're able to get an 84% variance reduction there without shifting the waveforms at all. One of the powers of using 3D models to compute 3D Green's functions is their ability to fit far-off stations. To give an overview beyond that one event: for all of the events we looked at, I've plotted magnitude, double-couple percentage, depth, and variance reduction for our solutions on the y-axis against the values from the GCMT solutions on the x-axis. The overall point is that we're able to get 54% higher variance reduction than GCMT. We see about 16% higher double-couple percentages in the GCMT solutions; ours are lower, but closer to the local catalog solutions, and our depths also tend to be shallower. I'll leave the conclusions up here very briefly. Thank you.

I'm Brad Aagaard from the Geologic Hazards Science Center in Golden, Colorado, and I'm going to present the first half of a talk; Sean Ahdi will present the second half. It's related to ground motion model analysis in the Great Valley of central California. For background, the USGS National Seismic Hazard Model currently uses the 2014 NGA-West2 ground motion models in California. The basin amplification in those ground motion models is implemented using the depth to the Vs = 1.0 km/s horizon, known as Z1.0, and the depth to the Vs = 2.5 km/s horizon, known as Z2.5. As part of updates for the 2023 National Seismic Hazard Model in central California, we've updated the Z1.0 and Z2.5 models for the San Francisco Bay region using the latest version of the USGS San Francisco Bay seismic velocity model, and in the Central Valley we've created Z1.0 and Z2.5 models using that seismic velocity model as well as the USGS National Crustal Model. The next step, which Sean is going to talk about, is evaluating whether those Z1.0 and Z2.5 models do a better job of giving us accurate ground motions than the defaults. For this study we've compiled a ground motion data set that includes over 7,000 records from 250 earthquakes and 200 stations.

This slide shows, on the left, the depth to the 1.0 km/s isosurface and, on the right, the depth to the 2.5 km/s isosurface, with the colors corresponding to the shading of that depth. You can see relatively uniform depths to the isosurfaces, with a deeper part of the basin down here in the southwest corner, west of Bakersfield. In that region we have very good coverage from a bunch of temporary stations, and overall our data set includes temporary stations, Northern California Seismic Network stations,
Berkeley network stations, and Southern California Seismic Network stations. Our earthquakes basically encircle most of the Great Valley, with a few earthquakes along the western edge, and I don't show all the direct wave paths from source to site because they would fill this entire volume with black lines. Now Sean will talk about our ground motion model analysis.

Alright, thanks, Brad, and thanks everyone for making me laugh. After Brad compiled the ground motion data set, processed the records, and computed the intensity measures of interest, particularly peak ground acceleration and spectral acceleration at various oscillator periods, we predicted those same intensity measures using the NGA-West2 ground motion models for shallow crustal earthquakes with two separate sets of inputs. We either use the default Z1.0 or Z2.5, which is a function of Vs30 (the models include a correlation between Vs30 and the Z values, and using it effectively nullifies the basin term within the ground motion model), or we use the local Z1.0 and Z2.5 that Brad just showed the maps of, which are our candidate models. The main question we're trying to answer is: does using these local models better predict the observed ground motions compared to using the defaults within the ground motion models? We actually did this study for Reno and Portland as well, but we're only going to focus on the Central Valley today. And because these are ergodic, database-derived ground motion models, the underlying question is whether the basin terms published with these California-heavy data sets in the NGA-West project are sufficient for describing the ground motions in these different regions.

We perform a residual analysis. Residuals are just data minus model in log space, and these are within-event residuals, so we remove the event terms, since the individual events have their own trends. This specific set of residuals uses our site-specific Vs30, so the Vs30 scaling within the ground motion model has been removed from the values plotted here, but with the default Zx, which means the basin terms are nullified. So we would anticipate seeing a trend of the residuals with our predictive parameters, Z1.0 and Z2.5, and we do see some trend. These are just two oscillator periods, 1 second and 5 seconds; the longer periods image deeper sedimentary sequences, and two different ground motion models were used for the two depth parameters of interest.
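A stripped-down version of that residual computation, with a simple event-mean removal standing in for a formal mixed-effects partition and made-up column values, could look like this:

    import numpy as np
    import pandas as pd

    # Hypothetical table: one row per record, with observed and predicted PGA (g)
    df = pd.DataFrame({
        "event_id": ["a", "a", "a", "b", "b"],
        "obs_pga":  [0.12, 0.08, 0.20, 0.05, 0.03],
        "pred_pga": [0.10, 0.10, 0.15, 0.06, 0.04],  # GMM median prediction
        "z1_km":    [0.3, 0.6, 0.9, 0.2, 0.5],       # local Z1.0 at each site
    })

    # Total residual in natural-log space: data minus model
    df["resid_total"] = np.log(df["obs_pga"]) - np.log(df["pred_pga"])

    # Within-event residual: remove each event's mean (a stand-in for the event term)
    df["resid_within"] = df["resid_total"] - \
        df.groupby("event_id")["resid_total"].transform("mean")

    # A trend of within-event residuals against Z1.0 suggests an unmodeled basin effect
    slope, intercept = np.polyfit(df["z1_km"], df["resid_within"], 1)
    print(f"slope of within-event residuals vs. Z1.0: {slope:.2f} per km")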
This last slide shows the standard deviations of those residuals from the previous plot, for the model predictions, against the oscillator periods of interest. The basin terms kick in around half a second, and from there out to the longest periods we do see some moderate reduction of the standard deviations in the model predictions when we use the candidate local Z1.0 and Z2.5 models that Brad showed earlier. That leads us to the simple conclusion that if the models perform better when we use these, then we should use them in the 2023 National Seismic Hazard Model. Thank you.

Okay, thanks everyone for all those great talks. We're going to take another five minutes to pause for questions for all of our speakers. Raise your virtual hand and I can call on you.

Oh, good, we have one person. Go ahead.

Oh, okay, thank you. Can you hear me? This is a question for Kevin, Kevin Furlong. I'm wondering, what role does the San Andreas play in your model? I mean, it has a repeated history of large earthquakes; do we not have to worry about it anymore?

No, yeah. I mean, all the faults that are active are active faults. The key thing is that over the time period covered by the geodetic data we're looking at, none of those faults have had any major events on them, so they've been locked. That means we have a nice elastic continuum across the upper 20 kilometers or so, and the GPS data are doing a great job of telling us where the deeper shear is, which is the lithospheric-scale plate boundary. So the San Andreas still exists; it goes down, and it will terminate against, say, the Pacific plate, in the same way that the Calaveras fault goes down and doesn't join it. This is something we saw in the BASIX experiment, and it's well imaged across the Bay Area: there are multiple faults, but the one that seems to connect to the shear zone is the Hayward system on the East Bay, and in the north it's the Maacama system that overlies the deeper shear zone.

So the San Andreas doesn't go down very deep?

Well, it goes to about 20 kilometers depth, which is where we image the top of the Pacific plate, and then it steps over to this more active boundary. It's a very good question what the nature of the connection is between the base of any of these faults in California and the main plate boundary shear zone: is it a seismically active surface, or is it a ductile surface that just creeps along at a steady rate?
Those are things I don't think we have the answers to, in the same way that we're concerned about whether the detachment that connects the San Andreas fault to the Hayward fault beneath San Francisco Bay is seismically active or not.

Thank you.

Andy, go ahead.

Yeah, this is a question for Claire. I was wondering if you had thought about looking for disturbed sediments in any of the other water bodies, like the estuary and some of the lagoons there that have the fault running right through them, maybe for the same sort of subsidence; they might also provide confirmation of the timing.

I'm going to assume that's me, not the other...

It is you, Claire.

Yeah. Awesome, that's a great idea. We've thought about it a little bit, but we just haven't had the ability to go out there and get some cores yet.

Yeah, any excuse to do field work up there is a great thing, I'm sure.

Oh yeah, I can't complain about the field area.

Okay, thanks.

Go ahead.

Can you guys hear me?

Yes.

Hey, everybody. Brad and Sean, when you were using the default computations for the Vs30-Z1.0 relationship, and I guess this is for both models: did you try to look at the trends between the Zx and the delta-Z and compare them with the current model, to see if there are any differences? Just for the Great Valley California region you're looking at, did you plot those to see if there was any significant difference between that and the regional model?

There are no big differences. There's so much scatter that the correlation is not strong; that's really the bottom line. And that's where you hope that using the local models is going to give you more accurate ground motion predictions.

Okay, awesome. Thank you.

Don?

Great, thanks. A question for Claire: I assume, as you might have said, that you were using push cores for your sampling. I was just wondering, were you keeping your eyes out for anomalous sand deposits, indicative of...

Yeah, we used vibracores, and yes, that is something we're also kind of interested in in the cores, but we haven't gotten to looking at those yet. We do get some interesting little sand layers in there.

No problem.

Another question from... I'm sorry,
I don't know how to pronounce your name, so I'm giving it a try.

No, no, that's okay; most people just go with something a whole lot easier, so don't worry. The other question, again for Brad and Sean, and particularly for Brad: when you were developing the USGS-area CVM, the 3D velocity model, do you have a depth-to-basement layer for that model? Given that there is probably some oil-well data in the Bakersfield area, do you have any kind of constraint on that anywhere in the model? I'm just curious.

So the model is based on 3D geology within the Great Valley. Most of it is a combination of data that Bob Jachens put together back in 2006; there in the Bakersfield area there's a lot of local data, and CGS is doing additional work in that area. This particular version of the model is largely driven by gravity, but there are constraints on the depth in some areas.

Awesome. And are you gathering data that you're going to incorporate in later versions of it, or will later versions be a different rendition? Maybe I'm just... thank you.

Yeah, we're trying to make it a community effort, so come to our velocity model workshop in two weeks.

I will be there. See you there, then.

Okay, thanks everyone for those stimulating questions. We'll move on to the last block of Thunder Talks now.

Okay, great. I'm John Louie, and thanks to an internship program I've been able to work with Lauren Lou Wright, who is now getting her Ph.D. We've used the same model as Eric Eckert's published Reno ShakeOut study, and we've tried to extend that to be not so ergodic: we modeled six roughly magnitude-3 quakes that have occurred over the last 20 years. Our aim is to give our community some basin amplification spectra, the way that Frankel and Wirth did for Seattle; that work has been a great benefit to that community. In the upper row here you see two examples of the individual shake maps, and there are all kinds of crazy directional effects, very non-ergodic, where you see focusing at F, trapping at T, and basin-edge effects at E. But we can take those six shake maps, average them, and compute their standard deviation. Surprisingly enough, the basin depth, even though it's small here in Reno, less than a kilometer, seems to have some correlation with the average shake map.
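A bare-bones version of that averaging step, the log-mean and log-standard-deviation of site amplification across events, could be written as follows; the stack of amplification values is a made-up placeholder.

    import numpy as np

    # Hypothetical amplification maps: shape (n_events, n_sites), e.g. PGV at each
    # basin site divided by PGV at a reference rock site for each earthquake.
    amps = np.array([
        [2.1, 4.0, 1.2],
        [3.5, 5.2, 0.9],
        [1.8, 3.1, 1.1],
        [2.6, 4.8, 1.0],
        [2.0, 3.9, 1.3],
        [3.1, 4.4, 0.8],
    ])

    log_amps = np.log(amps)                   # work in natural-log space
    avg_amp = np.exp(log_amps.mean(axis=0))   # geometric (log-average) amplification
    sigma_ln = log_amps.std(axis=0)           # event-to-event variability, ln units

    print("log-average amplification per site:", np.round(avg_amp, 2))
    print("ln standard deviation per site:", np.round(sigma_ln, 2))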
There are also, of course, the effects of some very shallow events, and the standard deviation seems to show basin-edge effects as well as those shallow events. In taking the ratio of every basin station over a rock station, even though the deeper basin stations in Reno really have just half a kilometer of basin below them, they still get these factors of 4 or 5 amplification (natural-log amplifications of about 1.4 to 1.5). But the average amplifications are way less crazy than the individual amplifications: as you can see, the thin lines go all over the place, but the thick lines, the averages, are starting to make some sense. The basin-edge stations show more amplification near one hertz. In summary, we have average amplifications relative to the rock stations between the 0.3 and 0.9 hertz frequencies that we're using, and this is actually coming together: unlike the rather nutty collections of individual amplifications we've had, the non-ergodic log averages seem to provide something that could be predictable, and so we're going to pursue that. Thank you very much.

Hi. My talk focuses on whether we can use synthetic data as a supplement for hazard studies when there is a scarcity of earthquake recordings. I've been working with the University of Oregon, where I'm a grad student, as well as with Ocean Networks Canada, on answering this question, specifically from the perspective of earthquake early warning, using semi-stochastic data in the Cascadia subduction zone. To generate our synthetic data we use a set of forward modeling codes from the GitHub repository MudPy, which we also refer to as FakeQuakes. We generated various Cascadia subduction zone rupture scenarios using a 1D velocity model and a 3D slab geometry; those examples are shown here on the left. We use these rupture scenarios to generate waveforms for a set of GNSS and strong-motion stations, which are all located on Vancouver Island. We generate the low-frequency component of our waveforms deterministically, using a 1D velocity model and Green's functions, and we generate the high-frequency component stochastically in the frequency domain, using models of the source, path, and site. We can then combine the low and high frequencies using a matched filter to obtain our full-spectrum waveforms. The matched-filter process is applied in the frequency domain: the low-frequency data are low-pass filtered, the high-frequency data are high-pass filtered, and then the filtered data are simply added together to get the full-spectrum waveform.
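A minimal illustration of that matched-filter combination, using a time-domain Butterworth pair as a stand-in for the frequency-domain filtering described here; the crossover frequency, sampling rate, and input traces are placeholders, not values from this work.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 100.0   # sampling rate, Hz (placeholder)
    fc = 0.5     # crossover frequency, Hz (placeholder)
    t = np.arange(0, 120, 1 / fs)

    # Stand-ins for the deterministic (low-frequency) and stochastic (high-frequency) traces
    low_det = np.sin(2 * np.pi * 0.05 * t)
    high_sto = 0.1 * np.random.default_rng(1).standard_normal(t.size)

    b_lo, a_lo = butter(4, fc, btype="lowpass", fs=fs)
    b_hi, a_hi = butter(4, fc, btype="highpass", fs=fs)

    # Filter each component, then sum to get the broadband (full-spectrum) waveform
    broadband = filtfilt(b_lo, a_lo, low_det) + filtfilt(b_hi, a_hi, high_sto)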
To validate our synthetic data, we compared characteristics of our synthetic waveforms with models that were developed using observed ground motion recordings. One of the main things we looked at was peak ground motion amplitudes, so we looked at peak ground displacement and peak ground acceleration. We compared those peak amplitudes from the synthetic waveforms with ground motion models, calculating the residual, which is simply the ground motion model prediction minus the synthetic peak ground amplitude, and we plotted these as a function of earthquake magnitude, shown on the top, and as a function of closest rupture distance, shown on the bottom. Overall we find that our residuals are pretty reasonable; most fall within about 2 natural-log units. We do find that our synthetic peak ground displacement is systematically lower than predicted, but we currently don't have any noise added to our synthetic data, whereas GNSS data are usually pretty noisy. For peak ground acceleration, we find it to be fairly well modeled out to about 500 kilometers, after which you start to see a negative bias. Another parameter we looked at was the peak P-wave displacement amplitude, the Pd value, which is the amplitude of the P wave on the displacement waveform; we compared this with a scaling law and found that it matches fairly well. In summary, we find our semi-stochastic seismograms do a good job matching expected peak amplitudes and P-wave scaling. This is promising for earthquake early warning, but it may not be sufficient for full 3D effects or if you want more sophisticated site response. Our future work involves modifying our code to reduce some of the bias, evaluating the spectra, and then running our simulations both through the ONC earthquake early warning algorithm as well as the ShakeAlert system.

Alright, I'm Annemarie, with the USGS. Sorry I don't have quite such a catchy title as Tara's, but I'm going to show some work that I've been doing with Grace and Evan, mostly Grace, who has been doing the work, to look at the directivity from this magnitude 5.1 Alum Rock earthquake that occurred last fall. We're suggesting maybe it should be called the Halls Valley event, because there was that other Alum Rock earthquake in 2007. The ShakeMap on the right shows that there was amplified ground motion to the south of the epicenter, and quite a few people noticed that, so we wanted to take a closer look and see what we could see. Did that advance? Good. Alright: Grace simply plotted the ground motions, PGA on the left and PGV on the right, compared to a ground motion model, shown by the red dashed curve, and colored by azimuth: yellow dots are to the south and the darker colors are to the north. We can see a really clear trend of larger ground motions to the south and depleted ground motions to the north, which is a pretty clear indicator of directivity.
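The azimuthal pattern described here can be quantified with a simple cosine model of the residuals, in the spirit of the cosine directivity function mentioned later in this talk; a rough sketch, with synthetic residuals standing in for the real ones:

    import numpy as np
    from scipy.optimize import curve_fit

    def cosine_directivity(az_deg, a, b, az0_deg):
        """Mean offset a plus a cosine lobe of amplitude b centered on azimuth az0."""
        return a + b * np.cos(np.radians(az_deg - az0_deg))

    # Synthetic example: residuals (ln units) largest toward the south (azimuth ~180 deg)
    rng = np.random.default_rng(2)
    az = rng.uniform(0, 360, 80)
    resid = 0.1 + 0.5 * np.cos(np.radians(az - 180)) + 0.15 * rng.standard_normal(80)

    popt, _ = curve_fit(cosine_directivity, az, resid, p0=[0.0, 0.3, 90.0])
    a, b, az0 = popt
    if b < 0:  # fold the equivalent solution (-b, az0 + 180) into positive amplitude
        b, az0 = -b, az0 + 180.0
    print(f"lobe amplitude ~ {b:.2f} ln units, pointing toward azimuth ~ {az0 % 360:.0f} deg")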
In fact, it's rare that we see something this clean when just looking at the initial residuals, and it persists across all periods. Putting these same ground motion residuals on a map, on the left, still shows generally larger ground motions to the south, with the red triangles indicating directivity to the south, but it also shows some interesting amplification in San Francisco and other path and site effects. To understand which component of these observed ground motion residuals was purely from the source, Grace performed a novel empirical Green's function decomposition. This type of EGF deconvolution is typically done for source modeling, but here she extended it into a ground motion analysis by modeling and removing path and site effects using several smaller co-located earthquakes. Now we can see even more clearly that there is southward directivity, with almost all the red stations to the south and almost all the stations to the north being blue, and we can wholly attribute this to the source effect. Using those empirical-Green's-function-adjusted residuals, I modeled the Boatwright directivity function, using the simple cosine function on the top left. On the left here are PGA and PGV stereonets showing the rupture directivity with the arrow: we see a very consistent rupture to the south along the Calaveras, at a rupture speed of about 55 to 64% of the shear-wave velocity for PGA and PGV. Other periods show a broadly consistent rupture strike, and we see slower rupture at the high frequencies and faster rupture at the longer periods. The last piece of the puzzle was done by Evan using kinematic rupture simulations, the same method he showed yesterday, in the Bay Area velocity model. We're looking at two different sources: a point source, with the blue wiggles, and a southward-directed source, with the red wiggles, compared to the black observed ground motions. We can see that both the point source and the unilateral southward rupture model the observed traces well to the south, but to the north the point source greatly over-predicts, and so this gives us another piece of evidence that there really was a southward-directed rupture here.
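As a back-of-the-envelope check on why a rupture speed of roughly 55 to 64 percent of the shear-wave velocity produces such a strong north-south asymmetry, the classic unilateral-rupture directivity factor 1/(1 - (Vr/Vs) cos(theta)) can be evaluated directly; this is a generic textbook factor, not the Boatwright formulation or the simulation method used in the talk.

    import numpy as np

    vr_vs = 0.6   # rupture speed as a fraction of shear-wave speed (from ~0.55-0.64)

    def directivity_factor(theta_rad, ratio):
        # Amplitude enhancement for a unilateral rupture propagating toward theta = 0
        return 1.0 / (1.0 - ratio * np.cos(theta_rad))

    forward = directivity_factor(0.0, vr_vs)     # toward the rupture (south): 1/(1-0.6) = 2.5
    backward = directivity_factor(np.pi, vr_vs)  # away from the rupture (north): 1/(1+0.6)
    print(f"forward/backward amplitude ratio ~ {forward / backward:.1f}")  # roughly 4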
So overall, we see that the Alum Rock earthquake shows southward, along-fault directivity from these five different observations. Other people have seen this as well: studies of the 2007 event also suggest rupture to the south, and Oppenheimer suggested that the Morgan Hill earthquake also ruptured to the south. So are we seeing consistent southward directivity on this section of the Calaveras, and are there implications here for hazard? Additionally, this magnitude 5.1 is perhaps on the smaller end of what we typically see for such strong directivity, so we need to remember that and take it into account when we're doing ground motion and other types of analysis. And lastly, I'd just like to point out Grace's novel empirical Green's function deconvolution method for ground motion, which separates out these directivity effects. Thank you.

Hey, everybody. I'm going to be talking to you about getting really accurate ground motion intensity measure estimates at locations between recording stations, where we don't have actual records. I'm Ken Hudson, a Ph.D. student at UCLA in the Civil and Environmental Engineering department. I work with Professors Jonathan Stewart and Scott Brandenberg on this project, which is called the Next Generation Liquefaction project. It's a big, multi-institutional effort, and we've now moved into a stage where a specific team that I'm on is working on creating new liquefaction triggering and manifestation models. As a piece of that, we wanted to get more accurate intensity measure estimates at our liquefaction case history sites, which are usually not co-located with a recording station. This is important because they're the demand side of the equation when we're regressing those liquefaction triggering and consequence models, so it's really desirable to have them be accurate and unbiased, with uncertainty levels that reflect the available information and that are consistently developed across sites and liquefaction databases. Many of the earthquakes that form the basis for current liquefaction triggering relationships occurred prior to 1999 and produced really limited numbers of ground motion records, but modern earthquakes in California, Japan, and other places produce large numbers of densely recorded ground motions, which enable us to use spatial interpolation to estimate shaking intensities at sites of interest. This is a method that uses kriging to interpolate intensity-measure within-event residuals to estimate ground motions at liquefaction sites, and by interpolating on the within-event residuals of the intensity measures it's more stable, because we can remove systematic path and first-order site effects from the equation. Here's an example for the Loma Prieta earthquake showing the kriged field as the colored background and how it is influenced by the recording stations; we can use that to estimate what those within-event residuals are at our sites of interest, and then finally we can get accurate estimates of things like PGA or other intensity measures. You can see they can be pretty different from previous estimates at these locations. We now have this for about 17 different events, with a large number of intensity measures, and we're looking to expand it even further. If you want to hear more, we're presenting a paper on it at Geo-Congress next month. Thank you.
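A compact sketch of that interpolation idea, kriging-style Gaussian-process interpolation of within-event residuals recombined with the ground motion model median at a target site, using scikit-learn as a stand-in for a dedicated geostatistics package; the coordinates, residuals, and median value below are all made up.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Station coordinates (km, local projection) and within-event residuals (ln units)
    sta_xy = np.array([[0, 0], [12, 3], [25, -8], [40, 15], [55, 2]], dtype=float)
    resid = np.array([0.35, 0.20, -0.05, -0.30, -0.10])

    # Smooth spatial correlation plus a nugget term
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=20.0) + WhiteKernel(noise_level=0.05),
        normalize_y=True,
    )
    gp.fit(sta_xy, resid)

    # Interpolate the residual at a liquefaction case-history site between stations
    site_xy = np.array([[18.0, 1.0]])
    resid_hat, resid_std = gp.predict(site_xy, return_std=True)

    ln_pga_gmm = np.log(0.15)  # hypothetical GMM median PGA at the site, in g
    pga_site = np.exp(ln_pga_gmm + resid_hat[0])
    print(f"interpolated PGA ~ {pga_site:.3f} g (+/- {resid_std[0]:.2f} ln units)")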
This is Miles Wagner; I'm a retired emergency services officer with the San Bernardino County Office of Emergency Services, and we're talking about how the Ridgecrest earthquake can help Northern California. It taught us new mapping techniques: InSAR data revealed additional faults and potential hazard zones, and these techniques are readily available after a significant earthquake. Sixty-eight Alquist-Priolo zones existed; today's mapping methods added 797 faults using InSAR data, and we located 3,455 county hazard zones, for a total now exceeding 4,591 faults in the study area.

This map shows the ruptures across the 1,373-square-mile area: the Salt Wells Valley, Paxton Ranch, and Garlock faults all moved, and the InSAR-located faults shown on the map are index levels 1 through 4 only. The InSAR fault mapping was done with no other visible data, which reduced data-interpretation bias by the mappers, and the map was zoomed in anywhere from 100 up to 600 percent for clarity. We developed the Wagner fault index to differentiate InSAR fault quality: index levels 1 through 3 are county hazard zones, similar to AP zones; index level 4 depends upon field studies for clarification; and index levels 5 through 8 are for acknowledgment and hazard potential only. The study area is larger than the State of Rhode Island, and the InSAR data show that both faults extend outside of the study area; InSAR data after July 10th are now being studied but are not shown. InSAR fault mapping allows for better, wait a minute, sorry about that, InSAR fault mapping allows for better seismic hazard assessment: the assessment is completed in weeks to months, not years, allowing community recovery to begin sooner with revised codes and changes. For any additional information you can contact me at the mwagner70 email address or call me at the telephone number listed. Any questions?

Okay, so today I'll be talking about how we've been using remote sensing data sets to constrain probabilistic fault displacement hazard models for strike-slip events. My name's Chris Milliner; I'm a postdoc at Caltech. The first thing I want to show are some results using an improved version of the optical pixel tracking technique that we've been working on, with the aim of more accurately measuring the near-field coseismic surface deformation pattern using satellite images.
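Optical pixel tracking of this kind generally boils down to sub-pixel cross-correlation of pre- and post-event image patches. A generic sketch using scikit-image follows; it is not the speaker's improved method, and synthetic images stand in for real SPOT or WorldView data.

    import numpy as np
    from skimage.registration import phase_cross_correlation
    from scipy.ndimage import shift as nd_shift

    # Synthetic "pre-event" patch and a "post-event" patch shifted by a known offset
    rng = np.random.default_rng(3)
    pre = rng.random((128, 128))
    true_offset = (1.6, -2.3)                 # pixels (row, col)
    post = nd_shift(pre, true_offset, order=3)

    # Sub-pixel offset estimate via upsampled phase correlation
    est_shift, error, _ = phase_cross_correlation(pre, post, upsample_factor=100)
    # est_shift is the shift that registers the post image back onto the pre image,
    # so it is approximately the negative of the imposed offset; its magnitude is
    # the coseismic pixel displacement.
    print("estimated shift (px):", np.round(est_shift, 2))

    # Multiplying pixel offsets by the ground sample distance gives meters
    gsd_m = 1.5  # assumed ground sample distance, m/pixel
    print("displacement magnitude (m):", np.round(np.abs(est_shift) * gsd_m, 2))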
00:54:24.000 --> 00:54:34.000 You can see here the new approach is effectively removing topographic artifacts and sensor array artifacts, which we can use to better estimate the total displacement across the rupture zone. 00:54:34.000 --> 00:54:41.000 So this method can also resolve the displacement pattern, shown here for the Ridgecrest 00:54:41.000 --> 00:54:56.000 earthquake using WorldView images. Using this method, we measured the near-field horizontal surface deformation of about 17 historical strike-slip earthquakes, all of which are shown here. 00:54:56.000 --> 00:54:59.000 They range in magnitude from about 6 to 7.9. 00:54:59.000 --> 00:55:08.000 Here we've compiled most of the events that we've processed, some of them produced previously using pixel offsets of radar amplitude data and existing slip maps. 00:55:08.000 --> 00:55:11.000 We can then measure the variation of total displacement along the surface rupture. 00:55:11.000 --> 00:55:19.000 This results in about 3,400 displacement measurements, shown here in the bottom right figure. 00:55:19.000 --> 00:55:33.000 So here we compare our fault displacement measurements, the geodetic data shown by the dark blue circles, with the field measurements shown in magenta for a number of these strike-slip events. These data form the basis for constraining our empirical 00:55:33.000 --> 00:55:38.000 fault displacement hazard models, where we try to characterize how the amplitude of displacement 00:55:38.000 --> 00:55:47.000 varies as a function of distance along the surface rupture and with moment magnitude. One can see that the field and geodetic displacement profiles differ in two ways. 00:55:47.000 --> 00:55:51.000 The first is that the geodetic displacements are systematically larger than the field observations. 00:55:51.000 --> 00:55:58.000 The second is that the field data systematically have a higher degree of along-strike variability. 00:55:58.000 --> 00:56:09.000 From performing a mixed-effects regression on the two data sets, we get a displacement-magnitude scaling relation, shown by the blue line, which we assume has a bilinear form. If we compare this to a relation constrained by the field 00:56:09.000 --> 00:56:15.000 data, shown in red, we see a number of differences. 00:56:15.000 --> 00:56:31.000 The first is that, as expected, the geodetic-data-based relation predicts systematically larger displacements for a given magnitude and has a shallower slope. The other important difference is that the uncertainty bounds are narrower for the geodetic relation, shown by the blue dashed lines, 00:56:31.000 --> 00:56:37.000 compared to the red, and that's because of the lower degree of variability of the along-strike displacement. 00:56:37.000 --> 00:56:49.000 Lastly, with those probabilistic models constrained, we can calculate a hazard curve using our geodetic data, which is shown by the blue line, for a scenario fault event. 00:56:49.000 --> 00:56:57.000 This figure shows the result of a recent benchmarking exercise, where we compared our hazard curve estimate to those determined using field data. 00:56:57.000 --> 00:57:00.000 Those are shown by the green curve, which is the Petersen et al. (2011) model.
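Stepping back to the scaling relation that underlies these hazard curves, here is a generic sketch of fitting a bilinear (hinged) log-displacement versus magnitude model by ordinary least squares with scipy. The synthetic data, coefficients, and hinge location are made up, and the study's actual mixed-effects regression, which includes per-event terms and treats the geodetic and field datasets separately, is not reproduced here.

```python
# Generic sketch: fit a bilinear (hinged) scaling of log10(displacement)
# versus moment magnitude. Synthetic data only; the real analysis used a
# mixed-effects regression with per-event terms, omitted here.
import numpy as np
from scipy.optimize import curve_fit

def bilinear(m, a, b1, b2, m_hinge):
    """log10(D) = a + b1*(m - m_hinge) below the hinge, slope b2 above it."""
    dm = m - m_hinge
    return a + np.where(dm < 0.0, b1 * dm, b2 * dm)

# Synthetic "measurements": magnitudes and log10 average displacement (m).
rng = np.random.default_rng(1)
mags = rng.uniform(6.0, 7.9, size=200)
true = bilinear(mags, a=0.1, b1=0.9, b2=0.4, m_hinge=7.0)
logD = true + rng.normal(scale=0.15, size=mags.size)

# Fit the four parameters; p0 gives a rough starting point for the hinge.
popt, pcov = curve_fit(bilinear, mags, logD, p0=[0.0, 1.0, 0.5, 7.0])
a, b1, b2, m_hinge = popt
print(f"hinge at M{m_hinge:.2f}; slopes {b1:.2f} (below) and {b2:.2f} (above)")

# Predicted median displacement for a given magnitude:
print(f"D(M7.5) ~ {10 ** bilinear(7.5, *popt):.2f} m")
```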
00:57:00.000 --> 00:57:07.000 The red curve shows the model using the same formulation we just used, but constrained with the field observations. 00:57:07.000 --> 00:57:23.000 The two takeaways from this are, first, looking at the blue line, that we get a higher annual rate at smaller displacements compared to the hazard curve determined from the field data. That is again expected, because the geodetic displacements are larger than those observed 00:57:23.000 --> 00:57:29.000 in the field. The other main difference is that, as you can see for the blue curve at large displacements, the annual rate quickly diminishes and becomes less than that predicted by the field 00:57:29.000 --> 00:57:33.000 data, which is kind of counterintuitive, but this can be explained by the narrower 00:57:33.000 --> 00:57:42.000 uncertainty bounds on the magnitude-displacement relations that I showed in the previous slide. That's all the time I have; thank you for listening. 00:57:42.000 --> 00:57:53.000 I look forward to any questions. 00:57:53.000 --> 00:58:00.000 Alright. Well, thanks to all the speakers for a great round of talks. So now we have about, 00:58:00.000 --> 00:58:18.000 oh, hopefully, five minutes again for last questions. So raise your virtual hands, please. 00:58:18.000 --> 00:58:23.000 Bob. 00:58:23.000 --> 00:58:29.000 Yes, and this is a question for Annemarie. Actually, it's a comment, since we just had an earthquake 00:58:29.000 --> 00:58:36.000 that had directivity, at least alleged, and we had one in 1975 00:58:36.000 --> 00:58:44.000 that went the other way. This would be a wonderful place to test your Green's functions out, 00:58:44.000 --> 00:58:47.000 I would think. 00:58:47.000 --> 00:58:51.000 Yeah, sorry, you're speaking of the Ferndale? 00:58:51.000 --> 00:58:52.000 Yeah, the one I showed, in 1975. 00:58:52.000 --> 00:58:54.000 Yeah. 00:58:54.000 --> 00:59:04.000 It ruptured the opposite way, and so we have valley effects, but the directivity looks like it 00:59:04.000 --> 00:59:12.000 increased the accelerations quite a bit from '75 to 2022. So that would be a wonderful place to test. 00:59:12.000 --> 00:59:20.000 Yeah. So I think Grace and Evan are both working there, and Grace showed on Tuesday, 00:59:20.000 --> 00:59:21.000 I guess, 00:59:21.000 --> 00:59:31.000 her initial work on Ferndale, trying to apply some of the same kinds of methods to try to unravel what components can really be attributed to the directivity versus the amplification from the river basin sediments and other effects. 00:59:31.000 --> 00:59:36.000 Nice. 00:59:36.000 --> 00:59:40.000 Eric. 00:59:40.000 --> 00:59:43.000 You're muted. 00:59:43.000 --> 00:59:53.000 I wanted to ask Chris if he's had a look at the 1989 Loma Prieta earthquake with his technique. 00:59:53.000 --> 00:59:56.000 Yeah, that's a good question. Actually, I have been looking at that. 00:59:56.000 --> 01:00:08.000 Yes, I was looking at it yesterday, after Carl's talk, and those are aerial images, aerial photographs. 01:00:08.000 --> 01:00:13.000 So that is something we could do, but for the new technique we 01:00:13.000 --> 01:00:14.000 haven't implemented routines to process aerial photographs. 01:00:14.000 --> 01:00:20.000 Yeah, it's only been adapted for satellite images. 01:00:20.000 --> 01:00:24.000 But that would be a good target to try, definitely. Yeah. 01:00:24.000 --> 01:00:28.000 Yeah, we don't have too many satellite images from that far back.
01:00:28.000 --> 01:00:30.000 Yeah, that's right. Yeah, they're good, 01:00:30.000 --> 01:00:40.000 high resolution, and we have the calibration report, so, you know, it's possible we could see something. 01:00:40.000 --> 01:00:42.000 Steve. 01:00:42.000 --> 01:00:43.000 Hey, this is for Chris Milliner. Chris, great talk! 01:00:43.000 --> 01:00:44.000 It's good to see what you're working on. 01:00:44.000 --> 01:00:55.000 When you show that the optical versus field relations look like they cross over around magnitude 8, I assume that's just an artifact of extrapolation. 01:00:55.000 --> 01:01:00.000 There's no physical meaning; would you actually suspect that they should converge, and at what magnitude? Or what do you think of that part of your plot? 01:01:00.000 --> 01:01:04.000 Yeah, that's a good point. Yeah, you're right. 01:01:04.000 --> 01:01:08.000 That is because we don't have any events larger than 7.9. 01:01:08.000 --> 01:01:13.000 So, you know, it's hard to say whether they should converge. 01:01:13.000 --> 01:01:21.000 I think physically they should, because typically we see those larger events occur on more mature faults, 01:01:21.000 --> 01:01:28.000 and so we see the field and geodetic data agree more and more as there is less off-fault strain. So yeah, I would 01:01:28.000 --> 01:01:35.000 expect physically those two scaling relations to converge. 01:01:35.000 --> 01:01:38.000 Yeah. 01:01:38.000 --> 01:01:44.000 Nice. 01:01:44.000 --> 01:01:46.000 Frank. 01:01:46.000 --> 01:01:58.000 This question is for Miles. Miles, considering the amount of faulting that your technique is revealing from the Ridgecrest earthquake, 01:01:58.000 --> 01:02:07.000 what do you think are the implications for other earthquakes that have InSAR data, like Landers and Hector Mine? 01:02:07.000 --> 01:02:08.000 I have currently downloaded the data, but I haven't had a chance to look at it. 01:02:08.000 --> 01:02:20.000 Yeah, I think it's going to surprise us and show that there's a lot more ground movement than people realized. 01:02:20.000 --> 01:02:25.000 And it's going to impact the way we develop AP zones. 01:02:25.000 --> 01:02:38.000 And it's going to have a real impact on development. 01:02:38.000 --> 01:02:47.000 Okay. 01:02:47.000 --> 01:02:49.000 Okay. 01:02:49.000 --> 01:02:57.000 Any more outstanding questions? There's some ongoing discussion in the chat about GPS, but 01:02:57.000 --> 01:03:03.000 offline, perhaps, in order for us to have time for a break. 01:03:03.000 --> 01:03:21.000 We'll end the question session here. Thank you to all the Thunder talkers.